[Binary artifact — not human-readable text. This is a POSIX ustar tar archive of Zuul CI job output; the body is gzip-compressed binary data that cannot be rendered as text. Recoverable member paths from the tar headers:

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)

To read the log, extract the archive and decompress the member, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.]
Z'>aJ;|&1HP [yюVvryf]a\ )V2OI3KYC&b3ܥ(QsOi[*cP6RIl6$sVNI D^DfLЮI}fƳf,SgpKqJ0䉤x Od*҈l'kKJHclTܪģ92Xd򙉽hZqilO7Jm1 Y^nqFi'ٰ%D 1/rtFDwޱĊAQrf)"X& W pSGõľiEiE&&2!yo ybi0AZE`'yx#|Rh3ujgPcp]]%Tj=5)% K62D T%˅I'ȟA^Z'eK @VeL>X(Ǎ&AHUUl,K{*UT<AY vr!;%KHӑq_8ش-TtOe/mYdd[I8U.DH*2E ] 7t ooooooooooooo~ `jeoz~}= c][ WAܭCtrC%{a-.ʐ.SY!bp.ijHQbur*02 |8+D|wm>a4]M 0YeܟD2`Qjb|ȃs@|>Ô2B¹Ĝ^ 7 Jn .%κscxy>xylCb}4׀ʾbv-Zmiq\v ~SEoB)ƅP> sE`qH,ZJ9,0g`hCE56 &q2kǴ^o=L.&ASJ[agf Dn:-L!H׮v}.x}Yܗŭp'8|] @0)ve\D6Ay|Xf _\3Ϛ [w˞-ZΠfZ'O7-`r9kU6XrNA2R!8&efД)L| T&p]N?<+O>Ř)cJjNmLY+m2$ `҆ %L1v3f~ w2Ѯ˺{ FvހoNZ(!Vؖ?k֔ ^'[&'p+Sx :U|$޻:zG(CNi$i8,.˷+h81`dE!G8m6s Be2)uK%B5b}}FV6> ɔIQ-!hdccǞ9(F@J 6L$#yAG )D+Κmg4&#Q{czQ{#8zvZPyZ< tb`o߈DHRVeu{op]/?hA5e//Ch㈌&>`4u^EÈo:?K1 /yyN7|Fð\>0|G\|6G, dI}>z,?UU0kooE蘒Kxh1 T" \Ip</ dNQ%ʆ4Lv͎u6g  mָa7jLmL+ֆ <m>|h|Af1(wHQxGOhN֞ng-4o>x 像Ҳo@CA˧ͻc<9"fb]G/Z Pi-Y#5I[2ܵCʳBQ~I܀C 7{GD|-䷰'ovO9W*o\z͕ߟ ?ߜ:`'7'G+@wӛ%x3ٺ?{N.7\ v,kMLs 9,sZc*-^4۹˔F+*<"1 u"U.$JԊ{\  myd$U(`qxq!ͮeи+e7W'qj-ZnN }c)LJsNgerBVVc7w1Am|Wd2UHJZq_O _p5B)AZdX95XNEn9kk_ɵVjvu ?M HOzƬ.4P}ӯ.zp7fãk!r>!69q /Wgя2W0FWJG􅮸kw*:d ƣd/^/;9z:69A$eΒ@]&!Z(AH~xQn5w.7*` T#q#6`rs6;q|##v#lfKمuɣӻV?qfr2q&d28*_~cbŵ}䉥 LV2Z8is[\OC}̝}#qnʰ~`8e NU>4cݡ%x)-^^%|x!temsAJO(&"i&A%K!Rɇp{D92lc7ԫρ[K١%R_b}.b<]u({Tb7*,^tx֢.Z/ξa͸(gm3mn~699P>vw&E5S9d"<+ZٻqcW~I(~;K <\$fxft UDž'`U0`Wbk>0g+*~A9`t9:`Ibfrhop1JS#DŒ8#Y^[A{8_E[BC`|2}^=+exorCp51\M@F'_/AQ5YcόGW؀Ɇn>M+Nsy+4Jm%6NYl5埗Xyoz֍L,Ua&_0.l6$x!,ڀ(KT>*%-]IJ~+b( ̓_w&١$t۫}Eݭ6/V/ZNpt^uKy V7k3RN~7~=XClK*\W?ޔ$DU!ʙmQe(U!jFQnmE;A2TePB&JJc%J T0Vt8XOޗ^.+w hS運K㜭uU*pIΥFELgH#g:ڵ6FSnC^W$gQ2|vd&zz([ysCo<5sGWd٭HZO%[^9&[?wވE ˎ^Ŵ(\]7PЏ}  hn`eJ:a,5=L`+0+,rUHVDr4Pc)zwͻf7ߚxg.C?f 1n-ĺNno.n>G_݉qO[Lr3[@3sz H3)JSbQ+t}insޔuŇ_CGESvRH/_  z+jw/πmPo1Oc=ro{#:˻'шD–>8 -($Al!o ouf._$&5;6d<;uȭf+&H}R#*ԣM(; ou]oitΧ e{}t 1`{ .'A$/JY#gW7)GbJE#kBc2JU( v}/'Q]yX9#cYʛT9`7PDU0 S<ֲAa/yݒxo>WgYK~tb_uם.-8E& hiZTuA!olXB[ &d8BݝoEhSTfӧmk:Vnj# Fh+1nD[W2\U׭)Լ|!Ɉ#ߌFw_eZ\7+UWR>pٺPUxrxzrKt'j!Wmle0c昇*1_Td~nx<蕶8=|A=Ez<$[P5Ј⁏(O Xr}N/q9_`+n\yWC 
zjnvUf+zϗ^L+8`:&{>KVWsvݶW:%yK!~]j38&)}۝_bzts|NsE52y&i\"B?O,V4b#+.}Ftŀφ.B.tEh<]1Jg:Br+B|.\њVpc+S*#bO0pM6N~Q:7ҕ✒*bRBWVQʦtuNmp*!cefXE.s+L(**ERP`\QkmUR$DZ(QVUHT(tXCRI[<8}kk!m]c`0v[Am|Y$hU EE[^J $XPDMM4C^Ci5NNV (s^|]9-*ƨMitR8t"W ª&i;*ORQ:޹ĞOIf*jj)S?vJ[镦hwqs^߫J5U=[3,pVGl#s`YK16~ho!/-Xl`.XFr R|< '2+`qGLt([g:0šվ^uLPh3{kHztuNScn?R^u^ Q٬|5l%pWcMWc8gھ_wT?%tt δWe?*NAW~]U/lWOB21~(1V*E: $:uF'*jhAO}dS+[0/~jf;]z(̆r:]1qLWCW.#ṄԹrt(AtutŷUNtŀɆl2]#]Y5fDWd]1\M0hM %]9%lˆ\Т+nAuVҁ2/TWQ1N6)تZj 2=qܘ%q{zHw%/9-}L\isYVS_%̧Ǹ!K3}PȆ&)*3ubzXx=tUBi_` OU?6U`Zt= gUR]`'L6tp+F{NiPj?1ҕ+;ZX~ֿ+zq}0.׺r1P~{\_\2uo,EiJ,jeTA.!ZvZ?9J˒T90PDU0 3:hMk*Xkq=[=]T֢Z.@ u)uRh$f0`/h*N?a`#z]i=tp˅[?+FLWGHWS2#b."^gCWNxQrg:2  +^Q\̅3fte@Kxcך|.dCWQ:5ҕrK KR kެK8t*G?֞V%`m,dJcEMÍuܺDeLI 4Qh(vсI\O\'Ů"`KF;3Gpd8 t1zސ3S1v:b>zFntE(p3]!][}6b/ZMz]1Z'NWyEz#N4U|9ņ4&ڇoHbciX fSw`#}3a9}nCj[sp_;9ZO8~FtՖξ.<5ש&؈AgcmF'62jh6y맲Ɉ|."Qj7ҕvUJ?(|6tp̅S+Fa#+RAˆɆS+Bٮ=Е ˈȆ?!JؙglʕD9]%]9P]"beGʛZ8uU:4Bpp)_+WӈnHؒGT92`ϱ PS_wK2Sy/F}* .fsB:B 믌~NyK3.;W|̚B(rmy|" ٶZgvbg{[_w/YFڼdԩ=rdq7ʧoX,w޷Իc}٭W}‹7o t{5Sƃ~8AHkz;ܚA“M~ӁB]N{&gQ5c ]m.'A$/JY#gW77cPv*F R3%&]A߻KqwvH1ww*t\O"/x&,S+/.wyJWkIf%M;s^#,HD<7kL}7✼5/_zq=mf5eXwmzIr~1093ٝ lց,i$yITn6)b%UWW՗ ;. h),ϋT3%# +JuyEvvRϥ miM9 9m?YY&'Tog ZTp-80oNOğKl]ӥe ?*eT+ pFX0OĂ!Xа ʟ.ݾwBNjX/WcVTI釤uc\~(̎Cvz-KY'jp>G+瓧⒀3|∐O*p^s|w>5'Wq秚sZZsUڕu5 JܝQx=<8p>Mc(GgBp? Bh9=u@P ܬ:8+ѪXX'@\0#|var.ST.iwi NW\ou狌^Jy7z+Gm5|sm,eEz9j3?lDC;4u ¥ܕA?1(f4-P$%aE \@!XW R$NWIU BCtЕ+thM+f7認t1Z8DW B3tpu-+Ѵ;: ] Qp \jAeʠ]$T]LBWPM+RHWLݝ\F1})_F,8H59DƔ,JL( QC$CF(4mH50x~Ox#c> !_"J.,{J!=yn},m-ɛaG<0B-?? @ho-=3X~yl@X A/(&{݇mfIx:VD  !D \LƠ_2(B"]+31p3tejtu%^]'ڀY瓋g.^ -;W1adRHGWǪSF]̩3te ]R5 J%;j#]qL[Ajͭ4̋|w}#2)#NAQ =R xD<,I4GH}̄CN;iО2b(g 'ZXBW~dI}e2pv ZNWR.j#]qѕ|bp3te6ct2(ZGW+$%!2;+WkW 2ɠAItGW-+.-U`Ε3teJ ]Zj7(I]$eO/#ϒLktn6:D}$A~xg'yvفd:_ҿTN yA4MM$,ix aY?a8F&r0ViWa&B$tMWuӟSDЊ`,j`7שXSN[RKeY]~>J17({YO%hmZtWu~.č7fHܘ!qcEt0AD>ǒ!?T8I$~@>Ua&}$tFi *׹ қh]>[s{ E5M~<}go~[/}zˏR7rjoXwWq0 br1G_o1 w܇o(,a8qLIĔZkat!*d !!%V KHD^B]ҝWԵ%lU F2dGכ7?SC< `-o>a. 
fbz/fi#oܧx'o[ܣ+ח`hza8{Ӈo!t1g~ό],@;z CDm_eT gC6MT\o{Ksd ̉ yEbDTT~"1}N qj:?4i~~- O@˳Qr%%̀lE꼻[vPNIBDS"%Uuw@U05O/?8H -lOETfݍ}.@Ç?:CC-O~_-р0EY/sa(GW%իt?2logʀeo:{1Xo3I&i>H~2EwI6a\x17盞ΪyG;NEt 17^f; Ou׹q gr[94_etڿ^zK=# __? Ơ2`(_D|sLUKRo]o< K%DIug)OZh8JQ=9Kx""g_J!5'D렄SX6ǶG̷j4$XNǗd>o[PD%.{v{謞nזD|8;xͣ]A tC~IYBU aW70\V;JfÍ (LO`pH-=so)i뷼b嬲( G&C>zZVLpCm&ZNAlB;aG۝8(Qx_WoK?ofBUsZqi=|N d`%gv p7Jim 6(1Lϐ :ZK=.}`Ɓ'_g*h!y Wͤh}g 38hUaM6$v懵Z%d@.'!+g({Om ە2wjKe/t~%o;[hfbQ+H 5URVն)(fא|mFfǜ=d xUNtDGhxIUf6m;xcpLD9=D+~F@w5}AEN_ExSXf| mpZ`h!'HwqpѳBnn)gzsĞgn1F6a<w9t\mK3vR;[~8QHZ&sܕ2y̟1tr:еUE9M*tl 8vѶ<4sWUƹӯz=ͷYmT0(vAźm+eVc21*fx=}ƣriZRpZ7mಶtɕG[TafVQ~;D.iH->.$gRK{Ҧ-1LJ!d=:g$)l]4JGQ嫛mSemuY^ $EUݲ{?<rU]IHHoy|.3Bk)Ie3ܝ]epKbr= 5J@jݛƷL*gl~lmuBx=].^XR1&Ixuc^0VyXQUeVtm(#/䴔h y;//`hJ R tHdߗW EtRB|2̴uxNNu Fkl:Q zv W<@*q-~`03V/{B`6Wا j^6.~( +JHFX>7빷ˁ?.-)A"]^LhxKLqfx:/ۭUtVNSAZ|+DazJ&ʡUR3*OUoXuQ~m_UXJH~l{!w-59ӺVJ[7ߨT0K~aWq,N0Ay1_{yTJ"YJQbt^2U%T ڷkٻGr#WdiT$ g.@#'JH ^<ͧ$P )+-&:jFv $GHۃe%;)(dDP /묢%BD}cO$"yLD!ip:H* 0ӏM"/L#{#:`G Uݕ;[O2c 8@oݝHe4jW?73v8/lTGXr#~B&~$~-łtas\v'׈ $xrډAfH [vvaȇ k2FQB%( uv㺞ԉo\-z$TD:?vՓ6 JKH^4ȷ܉#/.)y<?t4aJ%0PH!PRZe4E* kجe1[k}90TD$[1ѩ{{0ܽ%Ĺ{h >p+txfnE)!E$tcQk)`y]MSUy%uHikZ fr[Qٛ`pt:'pνc!FvaSX{/^6pdo߆l::-R!C!̽JFi@Mv@ko<(X0]t]fz g{#uҍ6"v*T}0-$7[eJaoF9+(!hA-<)MSA:c˦l3e+vUTzR$LcDWPޠ.#_; ?VFfEq%:Sيe&Hrj KsԩI8&tq([d9<`4߇Mj1Q7p5.@e+_vE){i¸ ,ӖZ4w?_d9X>LCpRW ,Ε~?u85ecϷGƄa0i ?o/ՃyvO$_<o-jx3J(ZŐcY6觍~9Dh(9*vɘ} nk˓QxuoGh%iֽ@;#\68?FwRg{wDCQq"#V [gݝ3p`P>ĿS*8vBkT./=Gşz' WVyf`gC:qLm: B$ P{86}0؝y3r"Zƿ)UL]'86}ڷ'60x45N}c+QB͊[1@IIʥ3^nS y>sxiFʈU9N&Q)U3J.X5t\JHr7y@)c_@)Q㺼~Bul 0=L,$(6gSbP"t_LglmTQLK=sHWe٤)0pcBL*Flοێy3 UD2Őhrf(m-EiFĮ!m:]$i:]jKB:WDp'ϱI-7X,/F+!1]nv;:91$E}!_'Ng8$ΒՍ9m 挧"DDV~~L39q Ǜ]\fqU]$(hE*|H=iQ YM%FH;E$>Ð.0aU/Zւ&~z |tEf[4IAMT+%b`}r<(F^v|n;FBt=sbE R%iC*SNW!Z>0h>BFV =.ʉͿ9z6Lp-Ea蕪N>rfS2]49&%8: D#}V]FXLpT FMX{̸ԬrbLwNۯ=5 JrC+GS,#y/`OQ B#u|I*?sR9m{%FqA}f @rQkd$!X|~Z:11؝,*8*cN3fQ?kK8z1xr0>9*|fc)/;:tHiiܨJ99 4Lr&kh=L#0fI'I`"H*h]Ӄ쇓_3ƒL,˧%SN|rbDzE:˴u6LD$y}y& J瑓߆%Ա[S_0V؀;8뭷3pVK/e~ҩS~ʍ7_կ$q/ijEY|C 
qZgT+7]ވ@!AEZu(ro'Uwۂu2 2[iQN@6NE3l`SN)M)Xif| tbL7V3'y]6u}"!4G$)HPa*ef $3^@ BDf9տZ5n:QT^V /-KGeԼY;RM˥뗼,Z$F~orjD`;S<[.MjoeRjc?tEyo E0_u->>f~L*B"\P JIX*󿿞׃mo5짆w،ujh9b\(I"{tbCѹX??k糀a}?(r`8X#ns:}=箿~>;D "c.." w%LEk'_NǮ=0R_f;5؟ Kޫ%X@m0xWځ*d Z<g&fp,5J!|-{$P/!Zr1"ޓ6rWI[RAvۻc)qאH3T5"fuusZGYwߨm{s'l$C!mI8X}RAԈGQ![ٗy+iWHsPSQqѭ1pPW5rA׶/) ~"ԋJ V#(eU:&|Ц q}W z_K%59HH Àѫ=%d/䰵Diy=zЧLTiOeGL/qX}x?Z9o3tlbf,3DOm{whAПw*g7^~,uh6O7w\ct64['# 1%~ &RC藟M?a|?73%}3տލx `K|s9s;okZ;Cmc)(K5R qsUP#%i0GXsX^}Phc D)OqQ̳rqnd/\^asdZe_luA)nb.DRs xI\Ga0A=SuAC3.I@kHAIPkRP, P ""4BlK{NqJ\07WC+]!ipf\6߶&a.2 @i1!1ZoPuv#04 9`d-˿&.#ZL,s<BN?ie5%pK>MF v{%kUFoiRiRF6mIm 5QA+m%̤ܾзܾhI4ʦXIb噶cmrKd!HcD**ޣ -jU6cŕPݣ<l1Q2ӏ*GwPCS;FSf]Sr^OJki|9^EOcn?׋{3g;}; 5sRbp4\W(Vn(I _frG&g j5)D&ȏ~aU^Jj3/tV@:e_)HꗇT :t~Z3|w$0`=x.aE\~78a *>"ऱ!]_/8 8ks6o 8D+o~*k콢]V0F~Sn bzoDU)؜陛ٗ w{+ϵؾA) ;-ѫV3qrmG;$o@t"*i˔WjTשz喈Y}&}skazLg"6*x&N̓۶Mǣh~[ "fuSՆ vPϦ=AjDB cNbNRnE\fzb༊0G ޶d Ck9ØroqrZkickOaήje :'Xha{6:j<,wfN}}p: c!#A3zjpJhq2)M9q41LO=(ڗ׏7r>:&$߅@E@g>bų%IӼKFN M^E98CruF%M$u;Ka4 hҙJ3֖BhE^Fi??#~ʩb]*8uX}g\CS Z2@ /SRIoPD݅u1=8= ,m%ڇaH5eZ#+gV[V@APn2$6֡X h\1q -W ʵ}ŃNU G]ݑ/\sRݍ̈ ,txm+oqeS7 4qZܠ"hqZZEQ#\]oQkپ;m®oŽy 栵 REq˔V M[ֲ`/MK#N8xUx-7نy^`zٿd/y͸,mCE{L pLC%38A8_hDP7N 3(nކeƱ\Vrd'M V=&P6fU:`v6fvйdVpࣴ6qPt~$Fo[߸hbD {pHsvMa!#ݷe!! 2@HkS6I&mLG!Sb&yqpJhハf?f_ܝfj{r/nsߘJ'Q\#ͩLO!/ dA_, ]7'ńW Yj=QJ7\ԌO.G7NP!?4%!B6߭4TnĂ Bh+ltںSWISAl:3'TO7kwl#I=2@d̨=l8,6x뽿=d 07DhHs, O>*eW`P$[q_J)",(oc^ !#݊t 35dbןBM;L-> ;B\+ Y^yKCwIDLF< PRQ&HV![}ۃ'1NvIÆV׽< & ;F8Q~U< OoP!]yA&mD;MҤޖ_GRKjyq.}m2 @HBqWHva o%ˍ UAfhL敶`VTQDmƫk06'&^Pq~BpQ0DF)iO􀊜YvIj .@ã4v0!췜۫8 \[\[wE ZR MU]/Cb=.ʼn~ J۷4P.U0S4:nU5[/x~^4L[yH$4T,e**5^غ slG.bR``ʂ1;_w@/pG~c)"@ch,<"G)UZ<נtPfZL\+ɚb[qwز5{ݻ o8  'S% _4LD+dF2cԯ&JL1qFe]C=6c2)}`9\tv"1WBjd!!'42&V23Aii;)* ۓDD. 
RhN\pImV+F6d0=+( ѡy֫vSFZQ6Т8R5J`rdl]VRT(i"b-inV~Q;lT;k 3]LZHJpP^ޚr:] 36]R,xU7Zd{AՓ)A: 0i/fXXvS"Q/{T@%e5)P84#L"PG yR*kzþ0~7K<I4IE /O0Җ2׺cm+imtﴩ$'Lџ,-f=$@g{h7n5Hp*C.Ã+(e95vAoˊkMUKnz-PSk~>{ 5{ h{`qX~Vpvh_h>U:A^F upX=ИFB7 6#= =n:U],jMBkJGﵩigzeSS;QY@)"qWW f^l[92v2 ;{Ӭ֌bN\v!d2&r]Ҋwø\a\цpUd,3m6E߯?C~.W3&-_' MQAL=r0աAW`)¾[ttyGīKªta]9M?ݲ<1Iqڡ۲yI٪>!t3d[kB&[30=hViaR&;TTw(.N d1N-zʡד%fqk025H&ގkXmr#rN1=lPo,?yE>2 Hѣ`&E?5{ Ӳ`|K_N`t}OʌGPu]qQ3J(L oɖo^92j2an7uI96(mwcwjt -; Ұ@ŧT֮e.$W&z[le|wFn\}kV#j~)O8cNzUi;f~HH违Wkj/q?ՠ59Z'65F/=R$s1 N,F  o6/#㲇ݔ=oE}Zcy4|[[ shOEw{>>ʦ8_*<΂R30Cc6q8KnpTI#d"9qk,rQ!$~WFJ^Eiң}އ3tΔ헶0Xlc\{~i2^RŻlQLplOC p)rtp " ̀rmC38⠛s9VMUdjG,J\&KҌYmtIaxk%*BtGb 5'́u4DRsq$:vu |@7P!4P}Y&@ooŠ8̢Mۤg%|duE7%>n4Bd!-,6zrpRffm6u)yK 4Lp !$ Pvd#xBmה}ÿ;Qs)+uvOMșk-UBn_Suqz-{|lk~ޱrZ;y*q)%0Hhd9M@H.4HK/k _1w UTt wW[߾cZ-en.d QZ;JznWZڽWʔ}-PJ\BEK`Bij 0gfjiyUmWWu y+8p72g‚Au>]yFVb^ћ.g6/K35u Uli o eQ$A+<"~#TCVG*%$Wzj ճws-TZDnFȏx !uU!^鍋U=K9Ǘ uxF0e|y#&ysFWV`Qa՜~ J]so~#,KX]ѷ ҏgf9v%MJ dXa6-<-=`sk:/*q̻e/%<23xZ̲?Y_ i6lGOlޏgʆ$~Ex26Ye-+P;+LX#޸L ^DSDܞ?Fß|N6JM* 7bT,#f3էa_SBa"eV~R2qSLjHTsb-G?ٴ]&C\02k[tS6hBk/Jڜ92r c몙`^})Ѹ,$ 1hbd)֊WD4{Lb8` -çGJ>䑩'! 
u:P]H7V`ӢpKXAMEos%ЃԭK3eDnL8oBǍ`+"(ѝS,SJIA5k eS #4>T5ru99ˇjtl >61~/cĆ)%1d++NC57h [xɂ3yY!$J?_aVmԏ~B ǘDĂS0"Gb4&a̢`Fi`QpZ2!ĪG^ed+NT^?0DfN0DgrmN' [HδFA,"(O#Ӡª<tޘn7idYp4*y{%sޚo!8mKy O:s4:*I$% "`"q M5"ԥJ2,:5yA#)Z&Fs3dEa\{~j0z+sa1[l\Ffʨ Y9\#*Lrb~Q B9Nˀ9 IN;[RdcbRMJ8r%k(Z?=lPR60 B)SvZbXPʫMTSS'E)QH8v鈧0FB h 1֎&B'зO>ۅP=ST, g_#W½VVj%VK&F1) eAIk nݰh{߹2Bpk'H_2dS5eG`Y;y  u"Gt31d|k}pۭ$FA4J6:dD;a#/řM X8$B|J`[ O|r#4exj'@+CW8{}tL+ܹ+v+G.Si)}>%Z0B ht .ޔ$[EKb]cPzҹ.Nfe-ɾHY htܰ6Uh ٮ{ s~k#P4X,ۑ"el0o'py8l~-g4#Z$E2*jhTpΊƲX 9+4e֜S=t5yZᥠZe 7NT[(únFz`ُ*RVGT%ZM"y]*TU^>9嚟r+ ;I!l>nɗe6 7䂲I\SI\2K!鳻08.DĽ`]}DJ~VMᒴ}OxZ Ž]ӐQ\ ^Ilg sιDUu#>p.P1F>5^!#ئfOnӇN7˄ʕ& H3 v*fó0p-3ǿNe|k0؄MJ;HNi-IC!3xɛߊ%w;7 eiÇ>闇/bU^@sUv`z#L^ܶ<0e0Mjv*g0]h6e "TN #C-gpDbNvƷl"U5Fb5{بfx48}kqRI' κA,hRԛRCxEt ;'8GȈ̏ :@B}|"dţydw+9m5̼N}G("mzlWiMq.{ݒaBG/F\pDR}Q$c&J*]?{Ƒ_ 6m!8lb%pi()$m߯zCə&e փꮪwWi # lʫ8 85%ycZ@ГRA)Y8T=WލkzsCVm%,zp7dXWcqE j$QyTB 3$!WPRlЧK1͸y)B PONDG (}ݧ uT-0^P0O:K549tu3T ԑѦ/FܤfG_hJ䢢8^V> !ho(N=&HDIu^MEcɨh .ݓ:X48FI PRƊuxϡdIcax(k1r_i&Z^h@I,[&qz4fʢM'ވ4`U: &n~aDM1\ݰ-0Q,"V&9˝!ER(7\is.'Gm$WC3?EsM]"MPqw[`}0Qhω}|'ygw k?;G(HmqS9ӝH^PV!fǻ=K-0yN`}߭pxiyͣyS9`|ƁrKV ïR98 XR?# x@O_}35؄8V;d0nq&pбȩ5Eẅ3L<j!3I2z?ErU^OF p6Jrlz+ 5Od&RP [&ٟʏS\GȨ ^eҫlej̫|n&Ycf4 fa.66V5Z`WC5׬Tw >ٮCWRz uwٽ˦1j0EgdKI$DXyLBMY=@HcST[iEAoŭ֔㾉E@zׂܰ=";uw?0<>#[#5yoԣDiHx=(u'4lysäFu~}k-k 0'"gd߄K+o0ݜLI l޿)ʵ{/67"Q &iI)-;)fx"M:b5M2)sqWkpԮ>_o}g7Ré!8-5#6n:Tgzz(1S1[u(ݷƲGUui*FƓ"[++|V* 1Lƪ?JP]8T^F;T4 I鉗fv`Rj^wP*2/=s'Z qJmN+ - H2 c2pPY`Rk3{)b({O<0+LzAil+ JYx[ʥ}hd=SV:ֿ/5>:Y(?{94=@ e'~^=#<,>:͆ʻ[1(\:g<ȁMTq%DD-8u#jr XZYsѿzy%h@D`b6i ,55A 6MK3?h dEsXusSFđ]5fA 6'xw]uȼ{iiSu3+n P|%D )Qj/|_qb=)f[i֋yYxoYkѶ4ug'/FmhA N3`7ۺڛܨMiO?@D ˼Ay-dɩ*Hm!V䶚Բ GzwWY (17~t3/gMjE3+5r . Pӕ_~׷WU\[ YXԫn*}O_*4IT{oނ a\28,/xI"^W3J+\Sҩ?:V5`q;7}fjp|HIYBrk:^aȅ19L 1xϘ#L*kJ^S KˉNj;_м,dR3c_HV k1L*KB0#!07W{ܶkd#?0.?27o3 gFI\2~5ُ#meVre~yn{=,(j@1#jabqi])8.+$4$K²*v̹g? 8W3 jS˺ʀKp>*V>=aέ:͔I&]Ah֜zׁX3EHSovf ټ. 
*jӨyub(FcuuX +qɹđf(Y;yhKksq`?s-@Җw_W0tY^A"mB8JJs%DJ28k*TP.%Ct0Bi@8~uyWQyçet2`+#N J*ƟN-i;JJX#KpriDVD?Yk^NhŻCoX!@N!<#櫒U3*܉rZbfHJcFN_'Gl{ ͔)Dq,4ʩsH w.BE28֢>aF.d:4*}1 N^f@EM Upe+ XP\PL@ꤊTq'MF-_阱;99d5c̘G+Fyzڶ[+ ZJEQ}ex_QYN+sy,ѼHp,ч py 1U9Ը<s)Π$VV~ M4뼨L<ϼ "dbLt[dYS*24vrXrc3YlsSX1mt\(Nx?ĖE}m*RbH@5NqFlSMѺ,ы Fx: ttcu|)a87T+T/s2ŰT/)\`I&Ze>TNgP.xSl%^R$U/DuUoJfKsO,\ O3fyN$e3vn[!)eiBoD Jm.wE)K `]#gmKIʙH9ߗOlLSЋ(H0*Je,5ㇷo/i׸n_栏Q1-}C5ԅzKr>4a ]i?/ME~jz{AW3[Y%C7es)?X-(Mo9ӴaJ?]5Ҽa|@%hjy8vڢb(n:N,udE?~$Dϣ-0!Z+"R8T rvv䢕F"c*-qE'k~$f8hPi ?(HƜE!oSߊ)J aMOr#9|.0j?l9<\+MÅuVyQ ߉Yᵺ+(I^-0݇aU>Si\4]-nN+,\(tMOӊ4|aQ3=37 m}T`JՉ:6D#K^JdR \ nw|0EV*ΰzx|9&svAd.'~v4A+w|gv0HubPfRsytK%jhDM6W`T.%Z\8ސLBq@A5m"|Iu ťPcۣ%اZ 35/!z92cv/~B%1-C%[@rπ%7;9*ق"Ni NW[vC pM\L?mJUwC}YDCXj3m0Z?).EU=I-iN mZtms$uټQRD n4Vw}Q}&O#‹'.>(%OϠDFEzBaBIsxOQg\G`9Y"#VT01I9iZc5z'>KçJ.% I1xXi}9- %@+)kSFgxiO`F\^ Bqb* {QD#S$hAվjWD޵6r+be"&X`bҴ=Fg~$M%Ib]I~u}r֢֫xu/̶׻`WIMeO߭O/&-$\a4 7Yfo Y`%Q`MC"QЬ"a[2ЖXXfi%iɱW/3Nz*_rλ1R]HTb8xԬ(ntpa6˜A[/+y+2T0ưlءmeÅӴ4seieJމAFHؼYK1z<][oٍ^槄vs+3)U NG[D5D%PςUaUB:c=Pa饤F ܳ*+i岮  L.ktq>ls3R+K.>IS4R.@4j$H;\Ymܞ W/?ŗay y+R(jZR7Y2XG) v/ȉvT\hp ̓{S^K:qFH(T>Bk\9GUܲFUT1ʺصvSGk^>JCO=5U/ ={+ iʲ6'7܅o492z^-vR f9ͷ ?`EhkAuwZ;kGH؜͛R48f}з5tmc tc Ȍ½a)d]W̩EN?}}Jd[ɢ5 5(0I'=^,TEjֳVW2]J=tɩQtŢ*ATxOF)O* L-+cX=|k{URJp@{?*(:ѳZ I}5vrzb`}֬Ϣ SSkˣtZDb`9P.RVLȹ]O4@u&ϚJ=qdN;Xyܡ&C &]-(}jׄũR}t sN0C wPgΞ5!a$F=#(%^ 7T:"г Y 剆bٵ U-ee-N meC v/%XFm6muHs,-u/bOU$g=pFH]Ÿ/\QD>n`sU53+<\n8nf]հ۽E-I>h2 |rprZzxBwq1B9sH6qkdNn!3զbGݽuj\úlݺ&V2?iokVtjENVtP4BH^tC/LlH$/OK[,ޢT&+B~F!-fE l}#ڮnwu䫇ƭ;&\V#쯪Zh}h;iͥܓ@z=A}ea``ts1m"b\8 Gշ _uS}G346ɇߎq ?.Nȧ?a|vM8Qo]tN$UÔWee#$4NFky5oϏ5͢NYvm }<El4V2%X,|1J 2p<_IԊ֐T?'-_hcqUqOm*Ym(F3$o. 
ޓ+NЬDbs` w\VH!8 WZ39@ jj<2ilS'jApd.k9ZW㐡RY P)LhDc"S&4z26B`X+lo[e!|7'gW(/e (*r@ tTLōx1{LHJ*@Mc:1'0"h)X-VcܖE{ђʊVdt+>o2qdz}^]H"#hbLqgNR1˺8ߪ[F| =FKMU} ыT[l{L=ʬ- CeT-5`PZeDS?H d,CiG/[Tv;(K & 8Dvh)?Qa d}CmN Z $U"@4gw24@'ƀM:"Njv ^ " ._qSOן_j!'۫_v_S /B@ 'Bh8@cn\8fh~7p DEP~%TR,g j gb/jo}[ a Kb@\dtBnUP%՚:sCDINWuRBQ+Ԗ X>AY&]r &۔dQ 8% uX%w{1%S~QTFk}`MIڷ~[|u%FIPXvl9 /eJhuWBk[,b{?ZXa7|(}=MrELWA|n;*}mcLݾO*veEpYǟܟe3h] U ȩ"a=X= w_l%$=zwDǔQ.Mb+C>KLHF!$Rdmu(C唐nZgF k@z.:1%̖y *x[jwM'g<_O?,z|6-8\iAѯ(M??~x+3Xe0 "9ͬ-Qȩ`fwLFC\%j=-ZVE-ˠ7!}@T_D3$dT~|^Mo.5c9"^MMM[KHnHtF%4[oYRW7fFb^``VEJGGD Nґ 4O74h) î. m8@0ŕٻXO,}k6i"J+orlVm/|_LS6Sܑ^OO t3]{y5 ٠3!L,}ሩDh84XRwpkOh8sL3=.Eӆk~*}?zy9od(P"5~'. %h,+E ql'  8lD9u6dMS }CJw,v"`A/d}zeM<]B_ĈdRF@DgлQt Lq46{4R.y/ad<86ғo-mH3$P%ܧlHY#e*t-368z>̇x8H2f^ `7ȧ+ sj) ~_}C5UOqnK8% q9s9hE :[CՕ_]oɱW~y/kCv ^FӖl0IY^SM%`twUwկ눆MFYw4D[6U -F"u]sء Z%Ŕa`X OM&;a˥X~g4 Sߦxm6 N]Ou&_ppbJY08_A$qafclp b6Fm2cm50q׵WOйpwX!c4 vpzQ2H@ez %\8c!/KG n ai.0ԫ!4kiP]P=͹4~Aܞj~ui>}͏wDpR (8gXHz/'~xUs a #dKs>6~-H۾=Jin9uGDq(eӑTd^HxR5%9ZܺX9iAjzOz83tj},u$襺 'Ape:]] \ xbytuqt X3xHxAmKiR9Rt̹t@sšc.l`_(սq ~I`ּu:@o\:SY)]칮@>Rn$QL2i|3R~.S^dsErNމ2{5@} 8ס%65݉ouKuo}Ku~M-)Rwt6*BЇ /?_&vH#s;>d\jU`ؖ1 v'"X4;]6m9?LǏ Ղ74/'<.C'wу=E8CK?8\+. KHvϿKFЊu;8,mZoi[Dt]|WLOkfqƾfԤsnL+Ԛlm Fd6< [,ם+vk@!nD> _^]Q LL~s*/P~ ]`%Ƈdd$b̮ segL T"+JˌV+R+fĞ#& :"b(%7%YIKp=NmE0UgM@BMƝ"6X|)ށw 0{h[[byHnE2.^@$7j_s{D WV%3<]sStEr\sid 5t~u-'m;m;~^Kr:o\6y>$:nC5FFݝĝ}:UYm2޳[+Xӱ8}5)ѓ`5Y (Yi+Dt1b_V 3& .Yr&K%5({Zh) *YXi>`QV>oEU!AbࠥǢ!@lԵW!@2OXsr >Bx@p+Ug;xڢoGw7P}ے pnfGS/|(>˭˷-)÷}zdN*]ntCInjY?&PB2 OXLtulŽs ZcMn6Wrew`Fptȱ:N(Qhs {67w?~;gkmaG3pJVcY +XT}{}K˷ Fۨp##PA [m"ͭJ2I&%UNQ\iD׃ gc,܇Q6&Fc;q@, i6wbҪމۻB;fd/R3& NK_d]T>H{NA2qO^d7>Mhji|C蝷~AY e8OB qk(L$HI ܤJkeL |ĹVF*e$Аh\W>%mɞȭe)l[xGT_&*(;%AyQfLp96"RZ<`^E! !P g0"m`K:Zp[V~*fT* I|G||VoOg[ LAXY1 H !?|%Xaa:+PT"{QQzG|eX 9 PoBNpb#*f,)cE6&+IzMl nH*M*x$Yd[Ps{J"'2b= 8i5ˑL,֑* L >dgm4%? 
FOxP\˓)m1s )u`zq#!MF-+BAf.Y69K1Z(&KWx$T2dHV>Ut**E$D*Z zKCT\2Pr4G2 f]eɲK.کVyZۮ$]_΋SO쉒{mq<}O{w'8oFU󿾙99l2>9> cOw_tGx}(dKUf\],\N[R-ψH8mKsVET}Pn$T\.ͦS5gt̀SX=M:$/9!(qYE U혲HVx֒suAV;GD/b2QvV^3Q1]eГ39-+Be3Q,VH-¥,5V5!,u H " "΢$ȒE⚄:*iGr4|M*5!+? Y!cdY=$l i v\1)]iK6SmT*ʲ ˯,W1#٦d *X։Jdܐ(H }zql$U5jOu*he;]g _)#e:ẖ[}BY'ÙKk;sXp7:tϽog1XtiXl ;}X Dz28ϼ!q`W|3Vv 0;Ƞz rkޒL~R{?2EzXvaP5~xQbX&溍]gܡcdCtē9#i?9_YWOh g%`Փn$88]r}]VL?®c $ ^ нw`So@힦wH~rZ.Ey ޜղ+VZ0bl. ^O-M-QqtLJm{V\tRR͹r\lT&c>Z7Y'@ J_K<'rWBR BPs{޴ݍFueǃAͭGOCx]w gWl[e7 , R4Ƿh|u3i1e׃sﹲI~O_Lބ7\lqtb6łgm͞繊aUcӰbyI㱨En#+".S5 u`|9qx;K78g')aV;&?]lE.]]n[eZWpV:s!Zwץ`SZ3Vc_m $ f{z( zo8-y+ ~4ݩmYԛx-4 o0uY|LރE$ٕ2 ()1+EtAaJu*!!c(x Ӟ3 Pw$&8gYS%{s2kϧbV@'G?O3CR7t0Q5Lc=փ?yXUeBP+S QESNjo;ێMwћ.2]HLs[2w6<қ.EcUk-7^\9gt ӃSbCӺVoJiuQmJqctIvPrtw,:_[Xj!!^}ď-R . -Ŕp@ $b}Mz#hKM5k7-z7>荋b\X-+bNUzsBVFEo\loR.툍fhG28Mw3Bu=vthzhBJ}?kVґ b"ا-Юv=&  ŭz{?Pz!=LiJ٧1t3oKݭ^DXןqoqj&.:;qQ]ω79 /O+IY!:dVDǛᐄii*O;C?ywf>_&_9\7+$@e!TUR[*e_p8[νiɠUkr.؀βK# uņj;$EmN-CΫ*ʮbjД.%-yz_0!u%@ Y^ 8xIW?(X Ӫy'ɼ/oGn}NOƷXnTu( qԍ],])]MDbIF衋~lPX#չڀ|R@Ui/nWuZ^hJ|CR5fv-T: #Y[vq羳^Pʆ]?;LD^[+L)a1-·B-!AXsA=ՒRxZ1^b3񘍸M)%ѫO_^ZvŵŵGpJnq7*JoX1QûflM`eabS}(Oє*a#s9(QG}M2EL]Tcv[9tJV&@ʮdГ@n8O~rӹUt,^"^Pܙ#[>[eq,h}+ C~f$Z~* N~3I7v$aKxG.Dr!>O724+mXre8]+̶so.tN?β:1|Mӯ̥ꦿ?fW]'.7 J_Oį;jdHQ#zL=g߄警EEJ"*%*lOws4}EOn4z ]>|]svL =rTs/;>vѫK^4?m r=C=\Qw??Ex!?8i'07ߪk+nlɖFG^B@w>T}Eה"vY >QQxBGl=;Rn$ޒAat@ގMwI/N|fW4J2KE2Hڤșٹ1;pX~ue&cO!H!|uoT?;_f"WTu=%Lb:qca-0w-F$`a,1m-2l[/˵>hσ߂{X6*|rkaU$޲8o[_jX= i5Wߨ sUlNNox͹NXԳooWs-ZMfRRޯG3KKY~h> M.߫3Qzow,2: pUFN7UJqHr`zKƛ=ƣr6_YnIoIq6B\2^CI8э}=I:mWߜ(9SM\zb<], 2D1x4qx*iZ7&jOd_漂J(#{F=f}RE,|4k!rG߳l4t ^ߘc.T _qz䷮5EX7~4 E-{jwzR ~K`K^*?7^)禦&ē,0]|ku g% Wl֠zQ֏`R 929[άsI!N(Zd :f`N?׬Qs8ږ$4)Z:vIya^؞XqMFJP.!q`5bemKP6Jz=bX]Eb7#,=Y*v#) y$7**+!iɤ-6=$CEWթ'}Ь<ŀW"Z/wu+F[kUo^kQA \z?>_9>r͝u7t! jU lʟowUƮ.y1 . 
3[Nx+vt7-qFﵪ_8UW_z~޷g?rѿv$ǩ5(8O?Z՞˦]6W/U##r*X3F/(N#[9$.FީY?_'j>|m>> 8 !='ů.y!Y6#Mco~ wN!Hj>yogm6X응QZ+:}`7R+HN/;SVuD؝$=bKO#J)w0ʟxIwVBƱ fPM^IE, cRk0G:hcu5ثrTљ橈,ΪPRe>AEAm|s,?ʼnI''?i9*F(l75L74eK_nb{+Uq("3 o¯7)@,LFZ$&!L[V!q]Jw: JLHi"6jbc:2ʮip ֹ4 keU *k SlY$;8 pO}Bf5Я~fIݩCr2YNEpM%a 0"^uЋG L V'4Z7-qcZ3ҹ90 ]{i걏|[հAH8Y\.fN,ޱ|hmn-*sK 'IkkaQtY"NC`{NdU`{ﭵCQ|Xs='|宜rfʙ+g6w&3fG2?|ר,'~c:VbI?3܍ջ?SM%H՚[,g(ca /& Z#Tlco-T&a!Qzuۧ]\2ëᠺH;x6.Cfyl+@2s Aq(SvAu;H.fLUɷ+HLE$i^wr{x(:U ^B/>2Zffyp>nAɗe ?VRv^5*chONATbHG!ͺragj*~WהN΁_<z"}&f4Ylhl6,OvYԦ$%꛹)7M(:P˥VF.KUgK{P \ >geJo/((F(nrhvn#YOc/Yy˺f}4ufiJv^YJDP$pj -) &ލzmRE6J+y#툁^}V_RH 6`$pp`pJJ$hJr݀%BQ-&y7[/ZU^!9+{7j.c YfP0nMYΒ\%.o][%u9{ʒGu:aqkșkz;ķB)ugt'&OK7!ɷ2}ɓ-GNfo]rη!qbA~R18xN'q5KM`824 m>۟X5y |ys(˥e}Nf*/rT[VP1M+B#30\cLPb*lѕPCt-ӈ6RqDr: Ծ XJ'r!:Ǝ9zkߙ.`PaѤw˖ [%ը|4SWQ"BVbk'B]QBʾ@CZy`*'kY&ֹw}hZT%UQ H *= A+GxƬoR0Rr^N-JI#d:iKbk-dJTO \{fMQCm U!@9]cWmm!JJ:ޓzHؼѵR TJʧgkZjHҡlyeE(h1 }>Ƙ7&:I}'ŸYABa"}NYI;A (gջ5zr 6u:}9\3ӈg"vREI"9ib{ٿ7X‘"WR}2K{kMƝ~ ϸsd@9"g q%C^XzpYnRor&OǍTهm}ÿ쳭֐3${ Hz#!G>Ķ9݁݁*>ά#89|z%|  (`JAߋ)9̋޳0%ϩN/F4&sɈpsYkc@]wZ{kșkx:0~ ޅu‘ , !x@>gr1= vT eC^ZE*ly5aD|8F$7s8,P{0ⷜ6D pG\ ~C)&O&aT?k55XpHW&j7>KȳB' w^xN:{ԍ!g G##+襵K(âg1q$com>v!̒%$CߠN ֐Y'7N4y2$c4w^r>\M?>Cf'Wc!Y݀ߧC«CZ[bҘC7HEv#DvVaF@C zqY]8srεfx/rc㑝_tԟ^nu|柩+bhj\G}ězR^tn;}P|>Sn?> B1C`FwPFRILL6R31-ƕFWWڙ-goMh!@C%8^B"V Nv4Du3f:vS߽ڳ@^ >2ۡ醦[|(0;A;]Cwg`NE7&{N1򳚮G}b0mhPgGhK裗EGv%]vȘ[9RpB 8sv;[$AR]pg2NԎz;j(pAFG8ȸ];'BH.\^ZgeJ*8 QM&x s8#'ڍh\#'Q1/ZӨ7&@Ō???^t>:j>&`$:ZBeed_cƒ&$D<%`-9Sv`n>pBG2٫W&p !9$TH0‰H fLRg6\tP׭3Fʌ59bZ]hs9D3zGV bEG/zyj.ܺ0Sds!翞,ŵej(ňo8_vOmlQ4 [YcO|Jd*`JXۃl Qܱ덮)tefFHG%7iXA,l2V]rVwH -$52{l'%SF}3Ak9r.0S>'(#(E.VIHɜ~&evnN>ABC)^Lϴh#תùlQnj)Pb''1Z}KzQ' (m惌Ia!G߯FޙBsSɛdĨ.!N*M{:l^nc"$2)8kO'j6AlH>eQ-*2:XN/-b b6υBΤF︪OV,ɕJEVgND9ZZmʉ0Eۭۦl%N޶'&}8yv5b˥{:9 A T1Y]tquA y|PAU/T 2Tkfbvբ1^<מ8 7WY޽ /8뿖"41(A$qWL' 0m /yӓb-Ean":T?45Č j#ũx%h#I%|^}进ۏՅo OwIC8#6O&AbL|ip֢ QёKF,[ q qH'N66DfmE=Fβp<us_[l#[,^{jx!;Ƒa y߰4FMuy//Go@0)yPfDhS>!&Z+Xc'De &VN0ED;*>,@}̳$]1+by8[el޸kb1u[9h-W!sx 5TM%^y YN [ /G=S~حjBT̑96s!o2m/Ls9 &V 
d[[?F%(PSp)[ǐzL,$`R2dCdRpDQ>34S9kC+Hq/e~kK6WP993UBȒ"sRcVLZI{UTߒkU[I=AwbyiatԫiIa3˒YJ>'pXt مr (rUk$mPM. /L-AD(T+P=̫^cҭ6%v%t.nA}+B'޺c\9dӪ8Q]U ҁE/ؓ06dvG{i9U\Vם!k38&^a3k'G༗8r6ع0;v8]Ob$ s,yt{nNTq3;<'f'WbsE=& }V-[A\hD'"f9kHю 9sm1/WK۟nּoR7ck=Log vx9>dEBn 9sM&Tt1Y@dvQVmfP (tyx *}䪔|٨yBf1O at2}j{>CNXP?a~ Rq%L)6Zō!g)B v(Be1zKQPAq$Tb,nOIY( 27G3c:a(,6@p nvJAwA]p&,J[%*Tu^Qx-V{fRpd*`ySV.+(B= y*o }/<>x*o a;.,zi.ؗVxEq9M}DxMAyk/^=Wa3zW+wWSp9ҭަ鴤~}"O/zZz+0bZz~מGr{X੣O}ՠ_/OjiCY{߽jÇ ⠻F_ovejɔ}pn?xƧ~d7|>k7 s/@G{r<ː/ bx۟~ks .ZHmdXeS|.3kݠWԢuSfQLK-'[8SvٻnWT~C1/ N>%Nlśƥ#[hxZ>-4횱sHظGx )WOKv'iMft]i 8+kc @F{^;׏Q#@{_"䳓rʺ+d0R[oz"UIHN -L$mHfR NWja&ER:fL.TF ZNCFf1sFӃ,.BgX"[kDD?IV|E֦TJ|H<L$YnC5ZJ)9YBh!g}>Q6(CVYuW~\X'iL;o7:|>tVh"K"d@D A| 8R6;m `gLq4bcb5!*-)&ևbJ&hWq?\39" e@me^[q~?]O)ݜ.r2+E\1J[r;yz{Nz}W͠9Wv4~NzaIX~޵mw0C[2i=6@8Uf2M`rV S1V XQ"ImȅE3q&Z)2fbPJB`⤌)V%YX޴=޴Iv~8zXY9{ߦm1?NudAb8-*otu*Ot: I bAy'H2-J9/+g7į;ʖOp©h8|8u*9 20x;ьfMr#&PP[Ew7"ڊh P}t L}Gi-ȳFMVͅl`-gapn/* MFc5KN=~;+_,zNy |#@oэn@fPm[ԡrP MR+G+8M׊yfp%SIYnb PfoUGrGf6ԓ;e~ý'j-p -lRln7&@^XO,z/<~2j6amlⅰ VǰZTg,NT,X9PT84} VQoDl<01mZDD'A"hњ^r=8]CjfudNs^iNG{/}o2Iv4|y _pV×7|Gv2_;c5}A+r#Zȶ3^gx@G-ՓNb|_~"S#!Vٯ#zNvBX"Y}6,VMZK/qu7o {=b_A7Sʣ됧Io/`uϱ4g4gᩭS:Ewx\ӆnhpeRNnV]Y =|~COU2o-WC-GK /@ݏ=\xÅppk(T*ZsD.HSLT1lX9|8S{7V*Jhsu8 m^M\qFI}]r:sziKd5ZCo9!֭9цӆJcz6WN@v }QDWa;$9c<{LPU=c%zK#',qy^iv=}y>podn~2Zt΁@.j"/O A ΉxMuJM䘲Q *PhgPbs&ɀA!,MDcR\.1 oKvMħJL;1XҒVash eLѕbCۚ D('2*"NX. 
`c)GTEiLǧgS+S}AьAseˤ-bٻxr(v4ֳׯg)fMp%p>R0mx#ꓸϧ-B @^WJb&Lr7MI'v3hc5bpFj=E@&ϱ%gd2Ʃsef^))O~te;йm\vBQՠgBQjE1{Pu=C2r=p:_I)k"-slVg0)Ӧ 9 ﰕҕq3;X9&fSpqzHH'k.;G*]U(YTTWVXMq&o]TbȖ91Rn}ql:1Ǒ,ÔyoNLud~C=O3~ӏH }Q ֤JUQ!GEJ8~AM՘)MR9V5*ECh[G4XEEb֍u o4K|^zwB-f k|iqm3[n#uVV>S s64=7 wjiLλر3g 1Ɯ*TqQ18B13N`H㴆n5}A/c ܀uHLV爁M}0%'niLj3YgͶꂀhHZ,2%#Z)P,SV', Eه=DFv$+$IwIwƤ ]#e/r͜& FE:ÖhQ,29 7GN ˒jmHHsٛӤGqMAƤ h *=iB+XZvɤzsdNцh=s2wXtܹOX<F(swz89{+1{SYߘ符1S4swzCЧ YV W-Ċю|EVD/ukI0e˜4!h㝞>FB;XsU DF̮UT*Vr`PA 7+N7C,(8 E6rHu#*_/jc9X2%Úƌb7*IZLQh,|3JZ{sЌUCsبe:>VܞY([o|X d}1uM?č7n:`oC:ۭ7/ V#93|ҽmȏ{C]b31[cn9bcZ9Գl 1ŤjkZNgARl y1҄w[~su/8KW7֑ 8f G.a0C,{05)c NalUrD&C-PjVPZqWvV<"Wuw@ j֊ĸxu˚1ǀFJaJj'8%gT9$!u J T6U90I9.ҳ`0(;Ƅ4(efփPB%i/= vb %9#"knm`@לO;O[}D_Y[I C9cDĥ|BO02iĔ[)=LHNƧHɄW.fRJTTJcр6):]iΦ,pjvx) ޸gN0}4`5y>X?Oxe-lڶ"A$ȇ$JVk0o|ܻS {q [yρz i4R]`n8t}7fnyJb5?QqO4v7C82Wdp%Sqw>Elgt fa)7]9J3TQ:]gmcK4n8l^#1Zv8"k"H&w@觴IKcF FL!/چbt##C:[Ʃن8rMjsFq4RC\یZf?w6;A\9ZFƛ'MISGW$=n%wC^b-"YgG,Lv'kB8>m!XyPvȷb$z-qgpw䈍g&`u},4)";7&8 4Z5چy%:0U\_kdJ妐>^6q8^>QVaKak)( F'Uҷ~^?OKl0xnk'x!{k֬)M>&k1$#OcVzQЪ2lĠ2Ԃl%2RWYJdާf_ ꜙCr LfV+Eb0i8U+5V>_v!dXvR*4՝:nOpA}zL,_rZYѰpK"d@D "rȧ )rDd"_пQ3_> Dv#6EL}Wqy ˱#dw^(il,k/ 1W1f4ZK Q'DM{ciԧxbN>,Ar;g[@[ki[ܱ5+9\Z϶(zs/JZQ+r;;4'~wG8-o??{Wȍ/{ l;,U`% 6r9 _2kyI~%,emq1TY~yɪU?Mgt䈠%!X#ynjpG?'ocGגּge|lLVq~ԓo׹m6r>1ІP{ja%תzj9%ΔfjIW~c2no'Mq=1Ќ6iY%KˑMOk 䖎1\u OK9gVЃnsl[xr/V7|Fn=ْV׺~჋ *.%V4Kl6.5ܵw\s(k=v# TsPolZgUGS$l}> lz8K>Ub;.PGQpٕg;; !_j;q[t{LoԪVvS y%nf=Z ٲ) |M8}(B(6c^^\Ott?7Wq!ce0Wˮ;sk&Gy~C_2Cw#GkĆ#ܪLj`2 ?aa{>tRZItnD38 v9[FT-:4% 1Rt<̱s@aΉ@!Eg1o\ҩ&댭70;PBVC_^$^\}Yj Wv,U (X/Cj7oblFj[2":B Dw&&x<vO"~ۖIH^WZeċ4m.:]=tFDr' >XnojXrFr~3-Fۥ)an؛R s)j٫mOU3yƣUD(bՕޢ-h:uGrkZEKɜÔEӭ7$K,?=9˜8/?߾?jQtN(fl?9GWrC[ GQs{FCVWU}[w\Y *,UWwMm5@t[3o鞙7g1`>,K&o% PIp GŭJWaV6C9f]ղyΆ޷lz&۴tjٷmz혛jk=W Vo򓣟>?v4쓎3%ET>GQ:c*DR;_ d*WWOz9/ǣ8S5ꌭ.@CsQky=Nq_$~o`] ^c8 ȗʎ|Pd|"] a:sn,a|#¢Q^FQ7+{(O~4U4X6ClHVsPj.\곺 stɚ`\[ [KK12yhB [ }. 
Vsv2%sߦY`;lrOB [7%vĥKv>@q~?1ր jxBxG;_׍vF{~{/zEg.&4@35oE:DKd|5uni@s BKٜMԜw?TCב,4g='{*.4u;rnbbo3MuҾu1Am,5dVݙEǻ\榸WXO͓q36&hW,?jw}{;Ρ7fVO ^x/DZ{Bc)?}:;SvX ϑ1o9>8|37vxM{qJ0ul3$DЏ_֒du0}\ * 81jj8 $JBa؇-'012ty,JaOKwz%()5 SwBa*xendKT]vPKv"S:@7@avڭ>|)fv(fr3_񇋙o3v ~.HX V"_yؖt/]kz2x0yւ"ڎ ەrBo Oݰ&7d0}4[HR$ 687qbeXP D '7vŃY<(d┷I8k6&8[,U$~SAmpr!%Vk3h! 6lиkCaB9Q17Q8 h,uZIϿ!N*fX1]Fu4r؆@ꘂsδy Z!#Z[\-\1 ѦuUAРzHAې͇G6hOjW}\1mk6d\q8ejzaٸ$^EUL =\Ž*=T6ӡU~kDZen6~WH)8tx`D 8VmsODpLӚXA_ڀ6`+[4z3.LiVYɽOFG2Fϩk<wpPؗnve 84_ ?ouc7e2㰤')HUM*v%uXR&S y[;CN7D9G2D_(#F1ЏpUtSen5@,B?J[Hd9\^ظJZ3dz$׊r &XbQ܁9Cu}Eq̵n JkWxVe7i(=:蘽R=Fhe=J%&Xd)X祕#-Ba,Hg(<&d)Le]c{!ijCVLYy%mӬkVNj5+߈8mo}0}0MfpLMK>VCnQ4kT—%zPլN9| 'fO*cc<+Ѡcj2 ]9+(J< Vj h rrUir5!x*j$DKDd1LNۗV($ I)CMBjx2]_/@zl/W-N{t%HH(7&|vR@^V~e52wgj]:w>|y=x~0c6qQf_9~JƧ=tV8zꀏ'q5H$A{ Nha~&g"hlgF4ĴmEB=ʃ[/-J5嗉{R-> `/)ڗLa]5akFѾ_]pqBy:dj6TBfs6f{Xpk=zIRցIhnJd<M@;qis. xe!;l츷,(ٹZIk0/ڠH{ =& sϊXu`3X#SN"'ލaAXye^֋z[Xm6G8RBBKPB3XʩrJ(XN+XzbӴk*#Yu~:6ԝ(EM:uB?ٜƲ4$ړo͆h|UC 5>΁GR Ҽ ʬgY}z6K)ڠ6h\qE:8pRdRc>A!+{RF/ձjnF&sDpߎ#`̱~kF(iEtyiqQ]l-FvDX)ڽ57.?Xd8F.Vpo YXص"OfFݛm"̤XB_>;qgۃ.M$Ív>#ˈ}*e9~(Bӌii'I&ER7Y?#rÛ[DEh4:(,-\U.o[({@ĥHX_FɕR%AWnGSf?] q0qH|PX,Rϵclc6wJws/3A m8;J,kbcg5^K 5#Vqhj&_ۃG!kjw>k˨X­C2MiNgT&d7YRuUvlx +zӂm<ˋD~i"[Us[jYUZH: :,bQHr(+=Ra`E1* Cc~#.K|\sq9))4Wcޕ0,);LdL;~rս ^#uvZUݼ)?TX!,ݳ`=|E(eK[C7N-ٛYQS2hO7MnSa&܏p܎qCmͿ?dϤ9'XJ0g N[^_JClrXNdby[T$FxpHt:I4%Q̱kr*Pw(4.ge)].KGW_!g*5}s{7Q'''u"]\yxqstd u^|J0+%TkD ,JL<ǔ1.$2&"Rmh,z0 [o݃ cڊ}O ! H ڷ% ΋QbkD`)q󔊇F=6|[ n%:I6XI' C 7w.V#]pZ Șܵ#؃˟.~8rRh8qW7ݳ!bFn1@b0]"S Lt o[JW3̫d0!"e`j(V8Yws NXjEKAeo~@ ?0K2a(/D|! 
Q|ddpN`juŪw꬚i-jh/R193" ¨D<jDOBYn5 ֲj5HabAFI ÇfIbBZ1S$;K0䷰NUԃudDFsBQ!H2eLs!xuܞV9|йʗ .jqHč!q Pnq('zq{ ӓK54?ch/%ŘɎnjhMs8 qTZ"/=^,i6GZ{^͍0Wzy|6W}s@w!bHs^O'bCPAqP뱴_yŽne 8HJv^;5p`s]xto f-w+_or$^•*'zjhu9,;^_vcyw|xC, ćnt7yvH]6ߺg}?FPwe^Y܏$SҟLǃ4K~\{>Yhgtk(!L]*Zֿ5ʹaђ7pGYt{RDsf%{$ZU[e}v"md:'2[њ n똣AGLe>ba<,ˋM0l mF<)'H 㬃L.sݔ{b&3yUx}Ǽ!x6dk,4Mh[#+?*{86$|YlmpoFr87rWT&)-f^%r^T1Ul(g{@u]oI 荟8QΫ/`Cjt[<,a^:^\la"T][>ZlU#Qu$a _ɔ_?X5CH!x?̗*vlllLQV6OP$Zt38Xe43 v:k 7ȃnt27Fs.B"cӣAilqoס¬<uRQ {XqDU4j\w/<]$eCA^/ 7֞.h < pO?fS5 :wťQ &X4X $Pqe`ƻB2O`;]х0(rUFbLٻEm¦1J_mڥ^\嘠,gri+IXhGƇ6* `CH04:0 c< 4JTexk)a Evലl> 6K0,n-0kIAֵZ6D(AD]NkpүHW}Z1͐bj^dS ?TN)_s'镕ʯ[,Fym1aC5ޓY c6;Ի.K%B}{3󔡶CP]="ZU./]җ4%M*}YOK_6,T'V @~4xD6B1bMN$ۇb$s\P$NN+o9ʄ*Dy 3C/MHLYqbL*[bA(X BYw&=)(,6Dj%'`H+(\[imu}$>G 1}(m=1 `%<*eʀgQ'28qu醊 uo I3RhKAsJd6\n]e%SnOZgH9uwLN [ .K-2ZS K+D)d|=nXq,O%sy)0܆J;D!#',q7DQmg\!ƹ $Nx!ypw()b4ORv c +XȜ`eB)e'"$J$~rrOH>mNHG((rGS$2&δ*[twWX5 P>zox8 c<~4=lN8#u aCiE1-vL%Ӫ%k1ʁ:77!5DV@+y$iv33kb oiD1%ՅzM+|J.5J #T:[‰J^ǻ&ZLi1Ɂk;(5%`)19k-UѽʿwMVL[2iג])ԯ|"yP/u}e ݑw\TrĕR\>̻߫C⍱ bc<o_/`|p16?aҰ]EޔPw{/,0yA9*M@TBJJRp \~Լ[xrbމMh^S`ki{X3r5ʌ z\$/?H_nUʈ;j).Bg2%(dֻޘ"I ^{2&7PR)`?*J]>9X`eŽ?Zm]^E[!5"J*Fw+wٸ^@^Z,WpOO*[frAR S͜UgU (g{0@ I&|]tmAG7' ~xәovᄢbuzs%T/geEj̶|Fj ^$ñf31#;0 cQay,g;15=ң\y9#ka,bUp-=k[cR~Y3UMǫ=n),蟃-t٬L!Xn3P6oѷN>O7Co;9.~3d fT 5fMmXz^CZ=~ǩ I&]^UX}$:AqTq:¯zڶFo5c飣_S@?Oj2F0j B!%񼪇1s\Vd5`1x9b+>Z;ag;"zMBz1=+Cmt(Wp%VG ΦpO+ 4M_Ocޚ%4AbLуw2Vrl]Kqj2tq XL( Uu9ƺ&SSj{PIXg+wڅ>t1Tof_VB$j;pos@yB@z H]">8/Dtӟ`펊7.p2 .p2^騘zENGx.@?*N3EK S)-в`4CZ}ˀp̽M1Պ*nHB3!0@9JPZo쓅^ͩWv[-{Dl|ZpX`F3adl&W,{"T=T nw{V igEnBRoփrJ l_EyTN0-KE2O?b#AI.8şEm!jeo^-ʺtroW@V֏lݜO?vb#! 
l;@L+o(src<-8vYE597'Xz*׹*hhAY3kɷUjBrͺϷŇ5|"(kk-5Vh27-g2W8^2@~P:<"h1`0AVuh_g9 +Ýr*,`>]?^Wfw\~/ o+׻c*3Lj<`dOL7e%9{3Ӌ\ǰه26g[ƫjq C4i*tNjbl&e|UtVlsJ$4.xtCX~ox9ҀafD1غNw0zQGOb8'_n6.b,D/szUOׅdyVL+~~*aע/R1F/BqV>r$S^=ZeWT$S 04ܫ;~kv^H^c;Fe E 8ח[?8X]sW &'3:to&3_ż+2ث,X .r!c7׶,0)pŧp ߆=ʷia3!)J7^ m0'|r_r [);x <>b,@6$p<6߷$iF3Vb죪9IT)s.Á|G&-}rG[`(,f{eꄂó]3.L!Z"7x͠7g:*V\,D:AY,R7HEÂ|Kqhs);2:^t4,6yWO&?СaI~vZ>\䓃i;FE}|;Ȁ{x Z=gx +Ig4덭p4wݛ.{K_42x~{nyv2s~+~NQ._ 7$Oy},pz_Q!_}L^Y3Oz_der r.g<6p|֯~LgYMdzIXN}@W:<4Yҧ `90'scٷ7v|#`wf4ݼ{wxwM||k]0 C۷1la?>~x9}Vڟ= f_^<&٪fO!"=pj8v~J1{ 1봏N Z lb~2J{QgRԤS=)  Z%OGSHNp߼_dL}ʜ`,C/9 e9gɚ{>&)f:3ۥ+oR^v'A2՟Iw1̀jPA($ r._fAAOk p0>YqXpA~0 |Hg@^ $ƒ^JU(^Msu h0zCnų,rW^k__y4YM.~F< 3]n~zz "D/6[o\Ɠi&stH=Xki!I`;8njaT3JT,=rō>v1L "TIR)&P.r5<jH7`tKZ`hT̈́'9BLL!ϡ wu0VA!Im[m]& ԆTYJ4sLϤdLS՝01Yp`Q` 0i.h|YX^ {PGX޿Ƚ{/Ɩ2Pr< z6Mo|w@% $qwJOw$;O`>^;MȰLh0A 5 t>]CXcEqMER7'ͦ) /di?]Nb-r5:Lz/ +dMw*Cu& `Ǔ8&,VeVJb+ 1 CLPr#(d" iFv"(NdXHKdBT1 މcT)qL'e|  >ƺXU cbRE@ʘ:f\„ !rjG82q45^qj^qI] ] n-ԊZRuW;{=q)/| s^B;?*p mǷN/oysiuL#һ`q┓O`#T(AT7-Uqտbwq z ѫ ol8exZ PxDeY!(: @ӯoaۭV=۱ʳV?0$ [CSvj>A5ߝH:йIP !tqGi܆TjŨIuA] :1IAk y2ӪƹVK{NZZTaAspͧ9P)Z^vI*3#[Foa5dݓ1cmshzY(g˗o_i+Lz.Wpvg8)rwKԥp8•WUcAvK`]INL,B].tF~87R_!1x9 9볋34`'a]&|h,UτL2=~G+5 m3VFt xm SzTVCvK(̣twY(QD#H@֔[± Vz[`95kv:c ޘYc*kQt538c;EO-%`̐F qHEPB% f.EE[A 'QpQ-RS}mwv')>Y6\c4`/sц "ބ/6tȳQk8}L?5 [i\P9OCђ~:(xY%,N>5_HE 2& AIEB[o҂pp4ЛLf6j 7S*NF;Dhzq:MN0j,ۨqG^hg455֢pskYXXX`#E7^`0yRzGMi2OUhq:Wv2O@=@ohM֎ g%&-kLbr5&%U#VR趻bäTɜ28nS!gyA&ڈ2p$+>m##q.`۩([Z"ZEV"|ȕi\Jd#M1hym5-"rx701I%\RA-!K  U!l0Bä7oWY7p:.w G¯Q!t4: M[|{O^9-7s|yet 2 qN-Z#5V^65CyUsn`)hkT^ 9Ǻ:*uTBGkBVix2HZhĽR:Y I{\c1QVp/{1b8s) 8VjPPyPm'6@#Wᅆ=&w͎!G76xH!5yup,\ hMwyI#& &mkRgb|QU#0DNN[ =TWDE T_v#ZVKeKm5 H Is1 QRu_هO_L٣1ªO ֯fQR / 4ׇI=g0c*ѱL LX7\3h%n^so^twA7+K~Gʀ_aTjXY cJ-~ֈW㝕CUdžnTe?Oٸ$n|K+Jckepn1Wlwѳ}u=+Q :Tߍ2VeǔayU T-~i&fo'5w4Aʃ?X`">1Qc2zyi{m6}?i?FВCSڃ(ED!3$66a8-gHRhT[a/!?Pcvv-¡:$Kj~\0'qn6 dr:(Tu8TCp(. 
a3l'5xp&-^\Ɲ%㚋_QW^-c(**#}W[>QQ!F}dnwѯ7I]fW)P,"{+VtjBEդ}>}LFnj=/77[Lܼ::Kg[{}&whAVmS2'%6AB65߭xeVʖ:CtZ7%(XMWlU s=:j%tkǤ!EXQL`g=wQ `%Y#J$bcBΎm15T+z$l ;m|zKn("4S<@+\8t?(zNT eu.!pPaOA]8;1}< .g6|&AkVHO% OBD,jzo}m09U)2ˋl. Vc<;h#Ap;g  ag#&nӍb|qsړs?]i\O7Ehi{ˋK,% ןK o{q~E&t f~\7(=S 5v9ʾ=+]BbUnKbdFNj{Ri9'%͔K"2e-!њ"[e+z b)A;w%<'Ӕ6sŹmG)DDr)v~VqCtO-#E8ǃ/n;nkv= ]LwFű]{2K@ܥrh>ge  ӝ r )"<L \ h0]NLxaXJX3Bʦ*ca7`\:Od?B6I@OM>Iܵ!6M Z/ woSK"AdYe5p%xxM#zn.Y}p6M z)vf+OQ g̉ӀHV*fh (ňٸi4.x v]xn3a=v!̈́KPrLz!C٦c|>g<:F{&r7a~o􂚼ݥw?m6;wtj.m 5];Xa'TC;;;p!ya3)z)tYF7ސFL&qO)L@j8L*Z!)+`kU:6*`p.'m0-Gzqͩ"zÅ3U_c"}`<~:1!(M`p>S`xT=1`U KsV`r-,3![)[}4Hyr;K?r$ַ`O_g׍o84BS9f"? k<=Ӆ=+=< 9"7SOӜ]醯>&&cD{yO]}Ͽh|@">Zt5_]h)˗S/o0z=oWdrGC ϛ]B2k`<*+v#$bu @UeL!d ~ a|\bOS {d'lιp']3?OtL=: kvc%sϩGSce_z^w3e&8aSթjOTN$+qEnW6Z[19I[i=$d8;ўwazۧgljuyg#/8Y7{,8ĵs+Zl<:Z?{7+=6+φs1xS5>X#1dMH\X` xsX \͢鄕.iyEtIc)TPfS #L>YW&(Z`d㽮3*]A ޘ-m;N!}o/gd?۷N_z9)xh؍^?}'swpҮ{k2AOӨ1cd4US#TU/Hhm`* FuF|UA 4l]1J ҡ ,h1jC Pr؟jęxHZeS0«!9]au`R mϠ%p6#nNL hOC:R qMP(U)8xD Gιx3jޓjĈY;K(g \d1>)/&T[ Z J&-@xSdA0IC/+ۭ#{^BψrAHIQD-N4 ĩK_5h%;т8Smþ/"vؤ*-&L2"kU F)9JqbfOԪ7,/ZD)&Cbb-(]jbJ!H4ҜeagkV܃X9ڤeŮN9B!k/Gzwn2%f+j9dxѻVͧi)[Gk`IpkQR4zs1$|2j9-~eusTDI}^:R/ 7o놂J0'`fdElC/|DyS9Ln9\ hW`ob VZm?beЀ|2]oil'a Sy"%|M/E.эң=#{9 \NLIj4&&OIz{>%yJrݨҧ$'b*IoWW/?k_|q/(hK̸ÿmhAb&G+/G-]]WFF0eots5{ͺF Vdน*0CtU6vuT!G!-}LFEz.^\o8o.>]]_%y5>{@JIsslדD@F?R~2\ϟmzG'pǨ su=]槸.>5ѝG)㥺XyI+!~_ 㽇yyݐ\B먊ϕcΤt5(SpL L%N7nwyyj?wkŅҷo!ԲMݮƼ}]Wy[:ىG8Şk)F5} _[uB!1@P#;V*)(Xޓ\W}Mt݇8c4Q$*$c}դ&E]MY 0W~UPRlN@Jq@o]:R,:# eB4*eKE0ʎ9'Zjd"s oG:Tac`x St=oo>ۇܤ令~m鉿]] ~[ Gyg<jV٥oꗋvu("BjV0dM~fkv&(h_Osb4_4}6Yq"Mfkdt{} kZ(I Qc"g+EлcQ vz3OO1`&5zE5EQ!0DJM@ :Ƙb1fɞ0P 0Rb$E2YeT?;- HZ\ UAQ9*h$NFH)iTCwVoЅ-L`RyuڡrfbM)Hsh= 7NMθ5 5AH`LšKj\T"I 央{ 6/}T=}aҁ=~eϹsՖw 3K)\ r@qgNw$/;Q]3gv[)UW3b|<`j4CW冢vXb9VOz*7Hws Q}GŪ2d|$g|D'#b'8 =%a ۡ@!Bzם}ġ/}͵ġ";ԜձE}|wۦvZ'|njP_kCIחյ;YXospn<U&o&3[nv{WMŖ?WJuZ¢Lbl>Ѵm^U?䇎 i{݆ܲ'ge+7'h Oa,%yxڭ9S6xA)%i4LYzc q:hݎEMunpV'W!p-)ƥmڲ\mߡ14ʏ 9jK_=?CJ(h tWzszs??a;>< 0jM||ww={T FOPTp3-˰ύNmƗH'}U-d1rFwEa cı 
+ž]Fv7)Xо'&欰gk͓49dJZVkam{W%p&*xWNRl_Ǯ*5jclxi&>vc<Աs {Yf0Sİ./^|It\^VW·rLmө dz@PqD,}{ؗg_{jKeH ̉XBuͅq\e8 Z"S0iQY\@7/s qT!lŔwfT1 sVعyeuIMUdOCO[PƘJS/ 83$Y^aI)D3QT:rx+\T &@*J|w>P*W!U{!0'( 8M~LZXk0@ ixSRf\-3,0@!JѨ7쥎iڦVRg*b 5τ!TҞ8OPDWTMqCJ #V89+Q8m)"R =±r*z-m@dXsT *b4ۆ/8\1\e1%U;C#4K嬰xɆ ŭ m tj*#AV=RďE+H0 <*C,ix)ҳ(&!xp&- )UY&P:?%9+6B76J")G J[`쁯[*Ty JgwF}nH@83}`YP42u)a`ܦI?}ӟDU[M/6g6^ҐϫIA:^uʙa[>(ur|6[,߈-SX$03S"~K%?|6T3"l@#}Ţ  4Cl0nPU%oH.aloQ71 `i't L5پO?D#%:)P\Ϙ fAcOL1Ra { aa#`%s^%6zG4^sǒn4{TIތpZ 4~M 1U#D%$8gX<50k^Qh"RZoQrF֛ȉr%63$&P8x.7_i-HZ M,d:"kwmJ76՚7Hk({ۻPƛ;(݉HӸ cE*f1Ŧ ǃsxݧ\y Oaq{\ʜA&-tfFzn"Qd0-lPVaYe@$Zy020!8A:ۂC͇mW{0Ô,e섴yʌWSOX㔖C9d` ߃Fp[M7]n(L\\^-mBc~?*((EҜI*t_L.nn௳_w\Y-n|[S,>&}1tb8'NZ]`{̾ȉ# o9|￝0>U SNJV|2- ؾӮn;P{t&.`4R#VRۉFhD9AzJmFRгCp&a$:i0l(}7t"ē!YO '^.]3ZمuL8Gtr "ī^`R9!ae5S@H5TgJA,tљH~X8dLqQrא0R;& Y*=&HX/>2/$e&r%%R[3R_Xtr*C OXaWl gzL1'ahvw|wPw}wL29aKE^,=s)ay\ZCM'.fqhzL.2KkEKYCSA`(g_hS̓5څ{{^]ZힱL^NE4ZI irN9-2q5oL(9FjP Ź1Kg릥)ZSLz7E8-iV3n0m(˩Թ Gk*;NUTT9َ6 v%ӻ86$T5.LJ=]hsϬdxGAǻ/0xqD7+'rx3{I=Bg T1g$!FGAv!X-36gB#"]<Rq 9T POFZPMYB%$p[[U:OJ(`^c]G^DzTSiLK7t N\az8~nkìcugꄮ=~kb$Ͽ,tpT~ ԴtW?zjRg?b 49//kK~8>GGƠ_EڳXKdԫW?_>n{TafA2XY~_I2V wK)H#*ikPvY萩dF6x=hu`ESL&MC)4UA:KiOOqզժW 6JЊF7}6!K8i PǶ(If :- &MlV1zGeJ")=^L Hl( 9F'cX[~PERPQ[LPnû]+A,oѹAV"sTfOްXvt29:ʺhdQ8)Ao0pbY.B`H.H"d) *i4:{Dh|H%R~Ii C2ʟFK٦! s9 5ڲ53gp F V\U)9A,hܒC̄Ȃ!c~ ?. 
~|18,]Gݔ׵*bu\'S_=aWca\N^$X"yubM1xSr ND'K֞<:Os=f쌙ϓh0gri罱Cld%7:z029H;ra*}0De\em|6 "=XBcU)F!byKXsRrBFb$/˻v64Q$7PdHB˫Wg3{USfVv;),8@$ٲa=t\EM+DCp:Vg),q-;02ʢ~(Ec]Ga#5lo`E6YbdQATOH`(R%h" O \L%FKlN=N'/J,=bt=śYd0^y.*(#Y֙M8b-b١ aJh=$v8b6ZzC*c[J sZXaKag=/nAjkp#L)N#+37-eE>; &3-pІ\LbfِTnT^J+E:kbs'I1ZX91yv(gKDHu!B$M@`%<Nʱ0D!-e e_Xy_*2өCAV2&)zαe p| ηUM+خrn8߶K=n~w$q0{wA`X3w F]K]%ޟ[G*,iirv d )Bs'V(cb{ե&}ۭLD{xPj,@EI] e pdgW&F9r͎;p,' ;@ Ϣ)17Xr{1`N7`yve RLI0甓`'8ސFT<Pvr{6a +|yo0Q9?aEh7]QS0^gP @"$caAr bu*k,7:2lV(ATiљȋưm'[E yrC=8m&C wDUf(DY‘<(31xYNCd/pF=:uvǏ?+`V(qB#F*HImRSdek!jVnx+'5^ιsnB̭hE-?ژ5f>2]=q`j˱,|Ź]bi,LisE,ȌjMԍ6>!3z&XL#f]e{ We~9&'\&Kߝ_'Jz_Y#OIY&Rv9Dy5JLHDZ7" ;cmRƖ:mpڽΎX/ ;%cYm>rn0{9[@s+v0o=mV  ڮ!V"J\ՀvkM-;nOٟ?z|LwE뜑x1g>? ˵Dev/eo}7ޛpxxɢ ~,ZZ}R}sԻ;xj kq)%{ԼQ R)}^[;o_)ٜJ%ar(˶LJ;mY<0 u?҉p(VjE7c$6IiEv i$v(4ԊL]HǓ7]:eN ENLu9ؿ@|Kjw}trrve勓c1}ʆWgˮ_Wk!7xAqKSش:ϓ  'ߦUL].M}(u$ڦZB;bN綤 ̍ r =JJ#AL 6lK *޺vvӵ^?ktFpsvƞpi;peD(вqg A[p8!r{ȨƑe!ugc/me5eu d(h![6WN$l)K+W.ݯ\XsL-UZQ&xiHƖBZ)$#tRiN77gg'KeӍs3dmvqVCy3wyٗ ,mIaz'킥oIBv!cQ eDU)GE>eAt)L%CI-MI% }9B8k)9>pKBĺ³MJ%d0O\",g"KY)^+X Tޕ5q$鿂ˮcT]Y"+=3/`)2DXٞ߬6AF5IU_U\Vm#3Dk4"fWҀ˖SRBq^!\Iy cqo!M 637h|%-uО2j9iD} C׀Ry(ZIebd*z AsC zzJCFgBk&]CM6d INp*qMdFWm Mrwi.-r'jnD Y@ f\d# /H{ZxP \rjYg!ݐd[yy *JOh+ta3+pMގP%уC4 !BcZjao8)SB$u4L ^kK*rt(l4 aH-`i.‹qoZ)Ҩ'[qvJ<ÿGgOrMJ8ᄎ\Z%>lo%h3|p>IMUHZ{͔ raYNYT h";qruv^:Z #FAV!p䵈#BC o ֜FMzXj" xjG0 !A(Qu5o~Zndn[ၤ|!;SmxJ`Y >g4%So']=sy*e ׭&9%HJG=*Z6J9;K=v*(:nYӜ0hB 1;ϒZ&7ݎ$3 ^t= 6BrgKE=aط^TQS  xW:V>iMA ~ RN5sqaa;'4w*D L,0BOh1zT.]AZȠ% E560(W$2Bt4rY S'tHE؁YjFSKѻ`),KtcCqZ:Һh)w9\w%?/7s*O6<,JjT08jhA> /3t,w2R({CU"AT# @|ڍC`寔qoJ PJcAPv9C A/ tA xvcGS冱 (?Qk:v`ʷq(;4A[@ gO@O whP| F outFh=AsH #eiO^+PO^+(BMp%jŻeKԊ7PнDxo5yogi/Q+)BM~ZM"4^+g)BKԊfP3Dx7\UPj={F먙3 xeԪk{0TΒ {Z :pczmFb B~%WY ~O?颓B\R/3>_N&0t{3LyUFj_|^[jG""7۩]dO?H)o5S5 JXlCW\7 /#}nf%מZO@BiaDf#n9jG zoXPÝ0 VۨG[n]Qyf;S=P7i5 svgpoHIo H| ( AR;J4 7jz}{-KXɪ6EFڑMT<(p5r1|> [D4aء=(`4 g$eLxi$D:ь8_ OE% = x3e- G)7*G!YSgEyCr[6T{Y f gjÅ}H\@BXRv_ػ^R.b!.ʈVy!R &iElldnoy&eTzJ7{Af&ԹHZH|?M_dKdl>xuf 
Xv$wbyݬksoZ4QlDGb^/ؒ4q ԯpn&'KC\ |3J_jh_G)mm< Iil8}<ՊNZ}答y;:[Xफ)--ZKЎ22/N{jBD4Ѽع=F޼[D֮A cnb!/ Kݖ ~sOE!xo= JvS兼Q]me,ZAcAP~ó@˽uvAK#M-?x!,2?$2M4N)1bsJRݯRHdD#T(̙6BmvK*"e "Q#Q.L혖W' fҲ^<p/ߎ271zTr& *(2;Jog4_ Z2q94;O! w<> ęǟ* nS/zw,f:qV}l~<|CEYw{D%<0,pM3U ʖ;cFE\GhONՄSP6#~r{~moEa AJѠk~GdɃZ](T ox~+IJ݄ fn8@ 'BEd-;-ʧoj8:58 Ը'FS Sp>Rb>};ګ+0 U 䈾UC7% 0I:>FmJ6}∾9mv<<mݳ5KBO(r"QFPjOuI""Aj;n&j4Da:N:KT Zmd6YDmǯEL4|:"E6bbc *y8&/lvuqnHwb?މy,YO/#Ϝ4Xg^:f&q 3o xR4EaQJ ,WʂR~*_?^mF9ú/z=XsDxg"^.W $"Aq*|y{cAXA^N1tŻZU*&@48).%I樔@hTXy4>R 'vA^i]'>dF=a";ꅺ!L1Jol?2Qbon)j$ 44 z7^Qo"pϕf?fGC`򎓆z#7ԋ?EhٟTZ.V4 Uݙg?iAO!c3w jSUqf'0nR{Y*#$ck͵L'Oի,Oq- /cro=#LWpt98{&'hjj T|Hfz{EdOK" c:$Dzx ,㑤@\`9"6Fr._&D8${A8ǒFA.a⒥&Ex J#G!Z :DN%W׍s}( :)Q0tT0oBp X+DH'e RϢC0d)BڭM1w5$iFkbФxM;;&'PW*W +u!8Sk?v,uǓ74IhC?dmasuPKeC6j-WvuI"2.#`Go2ƥp*t;9x;OL!Y:?B-Źʎ*X~w#agB\wwB5}$jm_*Nz* U[a/?>Kx{uJ~wr{pOT#c/u3 QpQF"x _oonpMe:-PƓY3l|%@X!܌~wp1yQ]q p I i{X@ĢRIƠuDH𚬥+u s+HI K, \ؠ kU,wmq$GWg&`k/xIWo`S#"jveevUI!%+"Ndd%Nx9hT1Xb(8/m9`?]#x' [4Y2> {MgΩh+*~}kg#)5 uJG,qYto^voW_,n]]=UD0ъ܆QpEBWޛƍ~x{w4cn{ziӛOIRG#Vx3o?^E{qwGyo\*4;hAeXonI|Q<;r}"P&ю9i d(b@᪊L$G1'3Z9 5bcuv\CĴI Nh^;NUF2r//Ehqd==OPj`,Б=vX1 ցs&p4m iU\JJ4.qhY.R Z i~KM99KU^67}rLHg^$՘R3y{"[3$WOr`r%SUO .cjK%4Dd}+9oc5烷ߙאٷ q\xsi/+5_-F'-pPyΚ8 rȺ7sMìj QmvQ.r!6 kvL!$JlDLQ'%Z60 .m:I >"Dz"jߎ%&Bt+&D"-'2޻1ILKٔl9_A~z,}Lfϵ =UkZ|iSe2N0Gcq2/e煄e\Olʸ2^*zxu\@q(l}ϟTk#O7r̪3e*9Vkk/"b6Ols6iUL͢/ ?eџ]5Ur^&~w|k7߶̛٪HĿh)! 
OD&Pnƣ6s1)' OOfN2 J5/O:l5Tsq2g?3G&5qNy" b>Xu}sa 7ͳYA L.?_6J$+AxyH^ed KRDk-QF&5UqH[;i )W|8;]nkW (# l~}n.0֎mǟY 29ŭ 睽s6jNUIQ4C2xy_fuj7 jMѤQҦmR 򛙻*ٳLi0_;clmL ӔKqch0[=Hp1iHq--J @Vb܇K݁ XR+Hd\Z\l:&ᤰ!tSj h JРh|{^%YVoVbDG tG(}GNU5({6!߹''MKJmuJߑݦKhXsFn)zFhO}$kNJmuJߑݦ\ErVhVCsSk]4`H(=T`kݶZ!'xG4$ rBڝ7'M8%rJ,a1e'bG쑰 9y"&C||wi5i>TmG<6sX>F_0+5>+ GWLryrA'RQLV>]NWKNf.SNK2:D, ,Ux@Q*t@bԜY) 9@AԐ ^`JP/`5bFŜR~==1YAI5xVgb,f>TSip4wX?lx .Ro')GE6]NQF/!{fB]t#m;=6ڪV^`u[Y:)e%`^L>ʅ`e3yhD@HjѰ\U.85v}-e6DVFelT@eOLT^,!2/)P0@AeH%.J蘆STTPK vJIv.lCZLEhZ.C@  $G7R*qWsoi+}_on_ەom}AgyvU#.Ϛ[E?,S:d -WōT>kI5>=!̱|3(B!P.d:04rR%[MX~MX&{-f7{{WwMyQJ ?Mu<}>3W~mV ٓ/?n=\}۸?/! _s?oxn//.PD 9"䯫?\lo~zcxu7vɶ҈1m4<|՟)hyR6'{~ҍ;T0 ]t$8a\V=a o2 $ N qUiv ,xɓN%+YIpaZ0w)jIo0^>3N>.Ve}4V(iŃA.Cyŵg6hFbEe{=?|-BYG;gmRy8>ze֎#YvmPY_9g¾[׼_eejcƣ;yVܐ=K 7dOEzk-x03g5=vt֯!CC#/{$剈a֯6<:%7;a{xbt^]PZ9~ztQ My2 j]NRyAUQwu!ؾZ_g(4:\.RsN 榔6WTTcǿ{lO6)wL:…!*Q b$½dBdlY m` ;@ )#NAE/'mYBYRYdȮ xhTZ@fgN*w"!ua( yu he@p \R X*%#ϛ Lԃf[F)gyI$yE,ϽB;F(.܈ٸǛ_FLF<;u^~tʜ0[@f0=R5t|8P:+(- :k(-ˬR!ƟT*0996ڛR<)A r9eS؜/U$j)@PiB=Ou)'=iM&|R|jlBF'Vp>O3B c 85mFI@e4Jӎng *D6 tF9AƂ t@)Ag3sֈf@LփEDf|=V27zt 5b0:PdpE*Ǥ__ѵd(ӾUMCj8.$h&3٤k* bbaSviJ$36F'DDIɛS00kTkTrRs``W1eGbN΃)9%kT35!c1 i9 4xGp5U Ԅ \ib&g6ՀFpt\ǵBS,$.2R_3U%#5}Vvb́0S^!>_{%]?cę}Xw$8J6v7qѮ^+jl'hRR"%_%ja8}q/pqn%"^%ZZ2n>SPVMɻ< `i%+)*O90cV{ڃ R:׬uĚ4< \=W0; ftj!9QEՕ`XlmDi@hkk⽯kV !CGں(B5TKEԡѬBAU˸J-mSc^R@Tw'*Qi !Cu?.qjͺU{h,PF'{ze a3*p@(VK!$f8|@n-NQ5ap8D}\0tX*q jU8% UX@iVq[ VSt9b*kTt c}LgU`H. 
ڃ\?l}Hv~53F)F:[`80z H˵ !N=aUm$Aa 8Pý=^>XO}sP$Tױ6p@_]3=F'\EBqxҞ:4@]((rK|s).5c7q1YٞVRE'貈ל#j `R(EoÄ' a7QuFq)R!UH͡vSki p,8` (C(mJ855Ss]gk+ׁ} vHK\UKV=`]h*pH*jlQ3'Vb!,@o,c=3;*ec^ęŘ767U6pK8!hMïg@ #)8խ}MpvS!qt'L7r1_vmvX#>BW065g3PB9gϹ0\*@8&,qFCh`É^i2AT 50q^ ]26{ƖοH4ٝ:.;_4_JrKݴ@*F =ph0UܹJ*aK{sv}|iqQ]2wOU+ 'Uͳ`a-G59G!+ -9B%BK#6v?Ǘ]z{oMƵeƵ^*thECdzgOg%%CNxVÝ-gKЦc#7.*,ue]2=]%ӟ6rH\.=Elh~H,vuKvp=<9{<0[L*96[CL ;~,nosޣ2,qU][;9~r;i@dc7MWLOj݁yҔxz$kxĴńEB0JM\f]{ίM_].\s9Ѳ:X>X( zc1RܨIC 3*(5F_vşu(^:F1l|{hc{4X]}F}HvǠU6ԭ`sϼ7AiE::̛dE|Zyk R+\W< rnZ)H8=ƘI?QR>'83^f:g5_ aˎhzj0%9X7O/^;bBNiI~<19#_L*K~t']ܳ!)H`c O#,& aj_9F8-emosN %y[RI \㹫ɏRxGӛTU6?m6D4?ϾdD_ :8C+K'0kJC;Xrq^cO|v7s|g N-84ڂӶ-X֫=ep\ *B8,Tl'RO??6d,g82_nof晳ϾF~H,U@͂H*io ayzj5x"ezLw*d #2Vʪ"ek X !@s! u쇤^A?,5ZU}cz{F'ʺus6:Cwg~N@)=^ ?P#M 7d~iT,b:3taL(yi-T$f%XnBiumEFg ' t]LHAC> K꺁U"hBG}r퇔닅ԏakҏ굃Q TJ3I7c%.\?*EpQy^I~t ŖΧӏ>5$GH?*QɨRYLXX?*'nH(/*N׏AxmzGMQ, # L(=,ʤP,J8kuYFB B딣" >ԵWGA}ku_-] ? en2HU&eެ9،ǧPNȂQơs92almez~VSɹ}YeL9K RtLK2?ԣˌHj'f)ϯ>5m [$:hs$p\^}smWɊqé쿮nk9^}9w+c"T7ߝl0B?}897wW~Nr^>F +U|88q~3MǕ=luo[Tm0#>cJt-dNIlorb5iFBp z:)g:״hKCim1w-RetINZޛPe ޮ'e&1h&.,a_bj~!o?V+o쪎ku  6LZiְz4ZRͤIypz CWs彝*ع^=-H\g$mle<оo4:GTcG0(4p`<8U$ta꓈S:,K*61eh9R=q&C,M%=j¼t:nZܐXybG_ˑ/ێ߷΂jYؓ|G`8WeEg7*)A.fǸ#;ZQBi,l^81LKxN0;d 3:OYdlR|U')zt ~~pNRza*-,R&t5wx%4s$_p!J{A98¬֠Z@8#`Xb,מZAR:5ۤhB"۵Z\Q_C@zԧ@%Rpfz㶛ũAHHoa*Kz~v*˃@Uz|hp$a gKJUW6 JIմ6 tE B;tuC4<21i!cW2K1]g時tSt@rKZ~c_hBı_0|ͰZ_;E~.ݸj\<ܜMy]jZJw7}Lk<ٔT S>3AP XSW(2p2Z\2Zv%w+[Qy[}w{u-`ѕȬ= THR,3m&]*.Z ^ݔ*%!Jy^7=],PSYF-&MnOubJj֩#s-B5@ȼ~{_)k2N6R77 IŅ[z4D֎g>^٫$OU9aEK&La<}$c>08-\x1k'"Chm.!DjdG?;KR޲]$Ȏ;w") ޺Ŕ2V"{۳83Dv2V`J:N JY; (Jdǘ#;Q邙Bwӻ5Ռ_F oWfkv"ϳJ8n\%[~*'0k{v;Xzq^cO|@ow3w48JO(=mПFSǙPꨯ\!uN;,PbL+]O??6CM,)!RjņEV<6{rmQ1YsHEM1D-zTj˃*]Iz6e9Y(@͇1ut >lF5>Oշ&fi$m3-@*QҦdP ]Xխ"q5,ݫ^K%"l)I^j>N[ \)UMdJi1Zv"UX䠦S2˳8Jfӱ6Y[ֹ5d Փi`0|tmhV[)'mZvK[i:FVʡS(IɔVb'VtDcװ]~ʓo2Oe E"`re%q2*RC)dT\*o\L~Ž> ao^^ݮfgM[ˍ}:Lj,z/3];gcE>wOhZԪF!բ2%Hh/vS* zE% PZ7ݻo?7F ¨4w=M,ق0 -Z-J _)̹ռ2J $!Z4)"r)IDu;wuPqxQyBDI,:,v%$Qa$_\Q$ϙ_uz؎g:$ٿڇD$iQs)wءMy$צ$ 
E&R:,IgLz#=U]<LK|s|=z$n}GVi*Zd.LVW0ҙdc)ĬSRi/ν3'Xryز^ȣ:+rN{#Ha ~y\*6h%+B/Q N|DrfTy1>PD^bY! :z dž8T֦ȾUPYO*ҫJB+>*b+vQZ tS9* UBתFe(T؏nҨD?&4Meۤwi*1+ĊDc#MByk+%Ώc[a:8l dI4XrlcI=ˠm*˗PkVMV,-˅eQ4y &-S,QԿHbȆy6вnh_6-+CeWڥeM# ״f)؄XizyD-yWHcL"Kdi&kH,K6vDS[2eFmiaK hVePC$ #6!2)mqʧxNqVM"smKp&`|kbPQNx~XD#¶Ogp1B(ABVhg!T؜"WL]JI TSE5'TټnBf"o$^,$.kƬF1zXYm()0& e9<\t>8&8f-7MVf̘7_yDޤsmDyFBmPxDUGVj^& ?nWdmʫ{wNK HM%QdQ0*a XߌQNWZI ʐ֜O,w ZdJUf%h~b/SC3Fd/B[0b*FRG‘z1%a@KFE#p逶iNr9@5}%a'F7 G{7&mD(2uU~n^UZժU-ak*IvNL f%y{wצ >6OB?돟??]/sޯzB{o/oMvOw0{{\ӹֹ #lo(Ǟ-WoZ߬>#ss!4/Tծ`ҵU]ja4Vcc{}4/ oR)8urP I]BA͙ p$@TyOQy+M zˡN|nj*RNy=AgR hDgQE[K eG5Wd!mU^VzZUhWVڇSM~0J]uJ*TV˪Tݵh]:ѪxX^Mڮ4m{0 U[mk(U-QkeMʖ]Kh+O%7cɱ-r% Pm{&6%CNam`m\0-Z 窉p(4sx襾עF7^#5$5Z=(X[L=&ބWڅMQDQ e:ƦT@) 6p3nɓBcZ)UEu9/%U)Qj.of}7:;mmFa2 ,䦗6?Xe 65"w<֖,b$-=(`4N ekʓ"36؛GcƶOgWp1&Œ%(jA -̮lMJI\̮EV1PtB-s0>]&ª5̀KMHy[I;s%m6&X6lbV[(;9+~y7@}[F@LbN\űRUmǪ7㰋_LOG $Qa_=z$ٱf}jj|5myL5K;fg@b͊Xd~3B%` myiYR-Xd_dcލHXU3 JKWN iRWnb>146~l11R'-Q, 1EI4 t҄GAoRkX q J{ ٥G&NN>@o|[ &GҾ яxMNٻ{%Ļ=Sή|J iҽRBL$R(( w^QRI]ts+Lk*(A:^ ӿ.vZiO$53-%s! _udz-.-#m0|йڱNCC N4]c,? 
KWk|Hߌ%[1-U\ŕ%bWUX7s3%4-`gHn%4 ѶhH2P !!,E \%hH[xqȆkR=խ.ަ!%:VKCܹGcf bNS+ccss$$Y"k"G^f `q  ]qfGr9aEm%Z( 8HçY]")dXOgNŭQgN(A|x4WʧxHn4|>W>]Ř (I0/Wʧx"g,O98)iLI be_RuZU^n?%B>c[Q4-OJҙb*"/sE J_iM73 f⬻kYw1ox+.M';3 *ϴcJm)3^B5xX?49p O |za2azь"dhTnjR (UB!%&"#P5/w]e~@ڕ&׬F\Y2oDDzo;H= G\SRiYǥ,%PZq d@)JayxG&SЧ8z\QwQ ȟ<"D1 S1A S=(FQ&y3BXmeN,U9^5˃0[u k4`\@,:u/er0GgQ]c Y'SD,t Rp--ZPo 9SyFm{y~˝"5_-.#myl+ԦjQjTF6RZmmѸ #ɒVoƒ(S9 5]aJ-ٴZ느]2Z]9h^ٳ tKB ٻmѣmc| - #.s7[@+'I-@9zRю_JV0#o  ' 7B+xnW@Gw1\2z'rLňJ2Մq^{EmTaDwrBoqwVJ+_{%i~|ɌQj̓ЛۇqǛ?}wrXzW1:(}LX|+}d 9M#|5a+dZk4N8W[$dA%WcL|74f9OUU|[5L񾩂Vͪ2ԪV5jYW [SIsXzgXyJfSvx2rAO!#=z Kxwzwȍ&䧻Cha:|ֹ ީl\E(Ǟ-Wo߬>#꥽=еڕ:JWu-S]X%+y|rx+ Hא1o8dOR,ëIzYt7#ө"PO⁗Q /JZ<0w,O-0ܶ<~UdWdx/.nc;φk?nM@}Xj ecYaħFp蒹12LeccIQ@~$,yHǥ9o?R&l4Lj75tx\(WGV߽3J}Š\cm?{1 *{|^nձΉ^%Dk?饧V{7.G>u yتb{.?B~tڒ0׋a@@v^UQ:Sy)"zlW^VzZULǕnHk"P!тkE%Xb"(pMڀEƬPmދ]Qv|<#-'DKinZjif^ŕiV( Uյu4 IwL󝡹#;w(R% [4p7J4q $C?{G"3@W 4f #-۲[r>̪%UYd>,WͥGV*3>dGk7:oO8tz(n>Cw\gu^}b.ӠNu WIt&xx)DI.sйQPs9+4tہuv7UFAP%` 2XNw`LY4L'+ߥZr&d;ucLP- U)jqr5V.J-47Դ>NhIj_UYuկY->뻯LWȝۏ7W?7-oۮ2]o+zlS+;v{_@7^>z= e5`l>% !M5KlUTBTaHQׂTTuY6%)U4Uja]b<ɚ7ɶw'+4*늈 mm,xXiMe` 8xh@0k̮Bh1f.?fDAm}Fג5v״Bv޺K(?=(kz1ȮV?#\@NSGvM޷ކ v%`,^ Ǯ u80h"ƚ](+9l?IrGcWNt|<ö rt닇FB熮2gW29q-Z./ߨU =lD3;Rd` S0A@Ũ(hu:Z\^K0ix<f<\fLfPUvh$)jSbtrU"O\NI(ԴbĜtJpٌq`$O$(|{VPY3tzYˬ`5~dV',t1/8r<+kt񁔹6/$TJj'.mI`^:ZuF{ZQ(G5ZEċ|~ /)V 9Pb&Ԕ4 9?}jrwQ Pb\ض\t+TۚY6.Є|bhvf+em ( d$-AB ;wbɜ$ՀScf1N,l`8x|b/P_sjrTGm gYLp"y)3wשA@}VǹbC:pS5PnkJN Ԫ3KSκr VCX[U[Zu3B;ܵ ݮ 543K T*-=,n^.n hmouQWy|튿7/̽>-c>-#a |,j_w^PE.ӆKm|+bsͥ-]UPZÿd= fl[lr օֱ>DXֱRJkj'BaH;!y rx @ I2!H%|FC26 m xݝf>ܴ HZtRPO!EJ4;wx>FpԻaoC1ZDa"k0) `c$!6f$'eҰYS$kkQ(ϰjHmfQKd9zP'-_@0!k{~5`l:N\ CA'DE 9ueϮJl(Eѐ4Ւ|9]aD91uxA ʝ3r$֑o|sqPAr 8uZ:YT;W}bs =! 
+(y7V>c%9Ol&tRKDLp9.NQnyԹ\4"Z;U]cf0 c+G9.z>#3.@uZꮼ8PS/v@Z\RZ}ZFH a4а!x1ulQ#lHu]S-#KmKQ>f@O^df< D Xݢl+Dld* jAV֩hH JOX4 i#4$o!Q(¡H"m!"`g}e QP7|ZhC(>&5 )u=ېC\`O߇"M3qy ݅_zǑ>c_RRAɾTlBhZ\c㔳憚7A*QrnV5L]>ШW^_ MwWoo?|?yn_MXRdv%]{Ǜo.&e~z߶WN=\73`Gp՛jy7ğx{蓧R  @ʔ@b"SSUNזz^|_b͗Dؐ>;ncmZ!%`LQt$)pLa!P.)k3b^tQa;!x [ҭĜk&,6kE]3V st0aAwàbT$p[:զŨU54qdK9TY#];ybY1.HCz'&˘MRPX ;-r>׻ ;oK8Fr!@䕰eB9QmNӉ"ha.lRN&Vy 0 Uge^tޑ/+1.cV'W<k1zef5˼c:Vyurݕwpk%Ίdn201Wy|7/bZV}KH NqYFҵEVuXkȁ.REjW+eT(TU~5x4pU|+҉-ʦR\ޯƎH ڨ teL1GA,e,[رQiE$ctW "|g{F=:Dȼ`PKAFʣtD,hv3j(蔈(Y'XEOL& i9ïW6<++编xeڊWV$77\|nhR34P^A ;sub$TaڌQ^AĜ#=-qH\氨 ;-8)/r 1 E:=!uXk ~s S\*%S>:5¸;+!VcQ^;xc6P>^pi"yCD[Ul9y"ԙ'r B]5/I TDzPx1/Y_*Y/Q?V_ :5^u߫M$u_ iZ?M'N7o+.`%+VEatѢҔX -Xn[ǺhUW_M-Re\W]oWF? }lkǿ}|Mۏ7W菇׏=`9| 4 5yw[]ڕPJ벮kA95;&H ֊+CU<)J}3ۈxzeY;f鍅E#,?fKڈ}yƵ59dGhT,k'֍^J*:گGDb{疫}W>>CC}[raͰck]u7 paɇv? {*;?w+0K]Qtff]6c;iT(lVYP?z8N5r!O5|mw6KfvW}LdKN fmYMQS;JeQ憚Xג~֦ @Գ6Xzɧ~|۞f@%oC_ >?n7h#j7CcI,(Kȝ&t{gFbbWZ> o67,p,`9Xd>%iR\fSA0[bP$?~(V]s\j2c^ zh ؾɃ~Zn>wۮ=|ntQ/jaJ ŢB\o[oX5;|घQ؉{'= hV/E[jŸH lz~rEn!eꗮL5ڔØs؎tnGSf-ON#inwCz8=͍&A TQ {;.2{otr1{(@6 b 8+#e61Q|Ɂ95h˄Bd/7|.DIb=7p=/J!Ώ'u֣Cגd'u9|I0rufjځә ۹q^ s?\=6ĝ^1s:%7*?E9\ˣHV7&E9N95=!d% 90ʩ)Zikvh(z& u%DN/mO q\'$Ŏ,YsqPQ4$a?1,dM$ Ńo@+sB]N x$tp9,蔘\bq|)_܏K;ﵜ 4 JʌgAc 817*߿mFeFt"tRbfyNrbe2>P/mg+b1ŗo]|Ic_ ad/{qMk7]MBDр57XEٔ|~l UzM4* qjjS4d\`싢p/߭FЪKI?*tb\PU MeQ(h ZWcj l+V^8p2ӶayR^ {2FxIcam}{,#/c=>`|{(q<ЏcJoxKuݡ u"IbEf{u ӍԬvӽL7 JL;wF-uG$9.)-`29/.)-+dc;-k2:mV͟ǸgC씡_HˊsV$ s CI''T`px=8N 8H@4(ЁЩK 6ÖFvQrpyvUsԘqX@nᓃ !?\3<8)z'97|d$6O'1h4Z٬K䤵:F1dYaNV dI! 
LOL@L/S5z)^h1-+:!pf0aN9a6 |+a Ge0 Qa>34T#˜y dOc" CQ譑E(:ARDcn F@́m1td+ G=n+0ΛQ6 mQ.ֈKWőwPNKtR^7Dd*1$ic0kfIԳ9 ;wR8!OFjQL̑$A>lwd  WIY N wRd=:BpmpQs"^b !E9=*/;5c*+v26x93c*{3.ze-L T'Pe+o0K `hYkA>ӲkMWKoZzқ~-ӲSWE#o/WU\re.AJZx2-+EQؗ ViX DTUzB(F NnhY(*j(jWږR6ՕZ Z8 oDiUXhOFOJ K4 WS'#"%[:1ymA0vZѽ.V-1?^޽±ڧZ8+o?m׌ugXvE¢Pwm{*M03`':y7{_~ ܭnj&;m^H”e)E-4Uu\ooX5t_oIr)w"zEq1];.o5AKHa{M816 IQ\cd`IQ\&[c^(%̝?яu*GWov ]ѯ^c>a[HYdV\st9*lt~BIH6zQ\9|Az89ҡx՟J(.ZJ ҡ d*6c$GQXA J'۸ NƠJ(W:&@qdy)L]5.鎡_6y|0= =o;/L%O)FƟNAq \jVdS_^nǍcF~cF#9ff 0~`&G`Kq,'j /651dm0LH Sx"a<{GƲ(3fKe޴btfuq v98FUsVYrh_U(5v]qn( a[{DysN|T4mXNK|lҽc@Iby'I@H ;zb95$ >s%IfOH'c)B%!{.ѣ:烷:'/2%](*9 tJxNyr:s/(57`켻0vjתh0v\B؁b0ɛ0vY!Gv Xf|~`գ7777}M]i,D+]U_5e!Qij4( ئ-ЁU/Vž^?~dXJ ɻJE!C5lEajV&טVzEUXWUA\YҶmJTg'^ymDTBjR|z(wۮWt9j?m&m5?=~+K?׽VxЬ:ՇPv!?AﯯC察߿&+!~X}[~3~绻|g!BFΎO_}xCzx~~W?R+@axmsb0Xfwrl_trT{] ;ZrpK(%`K/ N:0LJ5نx@lACjhJ$4ƯTN+YE^䶨=<%,(Fn56G ^nEF06Òk.qLPgtmLNPb@0|hY:qT'` TzS ML[x]٦e[6Hm]Zm([.0Iw:xEYUpUޅESS22ցiADRQ ϕB< E^s ғ?R W$"HDjV}=J4;wEtZ OfN~ `d:+?ʯD-=a2cl..',] .GOrḞB5l$CZe]:İz'ТxÉ/\JH[utR:F-rZLՐ9=x،Q8= [i>o8*yRnsO(_˰UL䭳νDf $p*e^Ox;:f)uFe+O/H2/ɰUZ ,DMQ{Pֺ֕ץ**B8lס"x/bPgQû}mעt!ԗUpyu[Ӻ^#eJMW\(޹4q+{IRuȠnlfqj^ǒr~!)yH0 (ͩFFȝOW>V_ε̗޹+ V~ۏ=y烯^Wڃ,>smSv W7,};Z% 45Ueem7~5_|w-ֲiX"Vi[4RC='-S5(kE@EXP6Mc.STB-%ʓF«f䫞)3WZEϼ&%d*vt33n'-sgs<0o`\¸)W0V@;}n/`\㡗o̾.n(ӯ v5+K҇ Ԭx,p`SVX Z18jn\mK[I6oE#Li*Zr ]uR`+k]Y BmQ^7EEJ$V`md}kLy сVu [CpmAw܃{㸶(=~k6zoh{[ArnE^z8]3LH2OJ r9gbkyqufa~~E|gt}Ce3t ұ:p6Y؀d p z8,J-1ӯө70FEV,Rm8XZUHSJ66)F' ᅛkx،1jA"Ǩ'N)xacK)[dˆ9.s2B'Q1%ǵŏ#QCQ8k)v:^OK?)F%I9N;l/8T%MOk7u;{[fӲrջV_LW{˧ )b_xa 8lvOwEqmo7?ltNϊ6ˣ6ˢHKm /rc436:<=3bȅ{K|x<&]7B.|P-rQ֢ZHr,U=g:d/ΚD$TR ybo8Zcܥ0mv:dq ǩҩ{%9vg͂;S g P2kuqRX(Q@r"Z% P9?kF#A|85^.HF|}C%78:_ W OXitT0d)ڙ)sg><-N 6{(Ubhc:qn 4[-CpV+Kg[f2JJؘKi!'VdubnjveU+5ieS(ј ]W4BHf󵀬*%.g(O*xPd+eDnI10 (EY5Iֶ(!T-f^ַyY7#"xY//k-Jq⮕m$$j(^V^<H_/(4fwp,:JCs>ըKC=VrwzͼAPSIiHӔGW qcLzr#TN𼾫`z>p SmA'Gũ( y;U Zcńp6Y<:,J-C;`*6c$P>*eֵDBҩ62+J'N1DDlDTl0$[DT5*"*B'XJH,J ^zΞM & tdR AЈtB%`|u4#|" Hy 5!VkY/a}ۿͳ@x؋ 8O)rP`SE1JͿtnox2*!& 
ZBˈɸ3o3QSh-E(Z`B/ծy$jRTȧnoڒQKƬOABŝ:-ԋ3sj$qtR/)H3Vɘ&Oa'%gpn?]WTG񺄳h(n8 뒴Eyi?j2Hyghrs:*ad"2%uv%zTEsϵQ?;>ħU.Yno1`fәgiO\jG T;*^ ba%f^ 7t/1{}m+dao_-'qmgmA|~EK8,kM[KoZzқn-ݾ dS t Ue׀PХC[*umy{O~~9w6] OWP}CYW>y|OW߫Xi޴һoR*F`\5VPYن W, li*]a+[j()%ʓj<9FVό[h4`=>1ˋbl{g*^Ze$0E?x)*Z~+6@9 +Yxc\VjT`/V^*mB 0޸ZUJY[SHPXG\р(|noV; J7W6jǏ_mSzJGmEysn|G=Ҏͯ궭cs`ŗ޹+ V>]ZրǞ^ ,Y9j-1'X[߅ ] ''񰶰pzX[vzVp.Y:E4vzbvz$3ttMAG,V)X#ӑbtbP:.-P!q:0OK7O'c` ALI"Dqa;Q4Xm93/C'_&/dR(?t/[>+cf۞?mîޭ[}]\-&3pHb_xadkƷ￟-q^Ylk$% waͲ(Aa(?m/Rs/Xkc0  .ҝqcXHnJTKXT K&c=qƉci>| _ IuN8c$V9m z<- |saSSI0Qq8!h59v\P4@yi='K%"O.G:% Ul`la ÷vn#.x9j+;jr\ / ['g}z,Ra aܐ ;`YCa0,"a'˟m (`F`1 Slk{&6ZZܘr)rХ6uŵ: |!{a`˵`q | 'd ǁI)z2#Hç%e*H~gOK0-Eжuuvu0P\ )L .gC^c vI j8Qȅ+(%pl93R!عwKvQJv/қvNU˙S>"$!UKVJ!&f]=bs]bSkm2S]??뱖޴7m-.1-9ijI0r*DcKSʦR`  ` x*FJebR6UzBM4H#Xq-*kJ HXW[pHܔe垑R<~5Z#x(93k#,o!Ux@8K1X^6B[3b] +eL:~F,$=o8d+MNT3aqגya #plo& 86V.7ReC;_'-eV)I1Uac-#tԯCGåSo:4\L֤ * kUUWR784׾ gV3Vo+nV>3Zu= \+SI}N7R#w5|ׇbuƫ_|Sݹ6e~w>Cnhۏ=y烯^7Ã,>smS|{AnTQ/X.Z% 45Ueem7~5_-|-%/Bn=cqEX߹ = Jgz#=U $ :Y þ5LlON&{nr=͢$yxX$T8H,}o0^:)k0M0YHalRh)Mw16`3D%st"fQd-Kj/r+Kl@Z2GȈcVF20 12QA)hNȿy89މ"jF HdMTe=4ø6O'Wq*< OdDtyJ 'l2AcҵiYw XBĜ#CsJ-E^yYYw*n>/ ^':%vEVZFwyUo6aW%m2c gz/02[us{_-=^ )U`6I0HlU'VeQ$ _nlnFCY8h t1lh?6djk2rT6k4HYq %E] dM cUa^,҄|"zȖqyh "` *lQ$BBfY&<x//H)yRGlVLs<#KxE7rUHI7#I!ͅt]b=JijH=OAIh9|"[&%|Ij\tg0)3XhPVU I:. me 8`%!YsX,@aii/i! 
34$Na\_Hie]ˮRFJq`mXZd[hx y~7McFՆNjJu3u+4)%mt1I K36}e]d&k 'wI1I0BbZ%AC#wSf%N&MBLD j[;{zvHo#H.o]4yx:.} 0Rs"*ЖE"3YR\[1PPa3(k-YUfEMfZd6GL}*Wl28K:-9!tjb.wℴD Sm A'G"TfY:$m G"Smr$E-74q#6O'Wq!;Z ѭͧq}Ój?{N{\{/Evx?zxūL+V˸zjS+1}/4c7k'ocGj՟r`?i$cB޶]+Rz{xwpѹqO;C&&RBp|X.nZ|I' ri>,b3q;i>BtpӣY=`䔦SYlJSd|^_%fLlNШZ;iF1SJH(sN3qZй7$BHBH :lfjN/bT|4T{)A1{(EoIeˣ\ly *ۦ݌((rQ.mڻ+4`09v d[.foumeTeUlDѱRk׳7Nu%|.T 1'H*ǀ: `Lq碏L0̝#$UI ̚h UQtVar!lrn^t1*'y(ҝ@6nɏ:hVI G*@SLA4#C;G+ɘra>iTqlf$9U`LuKg )`u_,V-L ٘R7CXb;5\uu]:x+k%vmg54 TtH2Uۮ.IHݓS]mlZESb%+K!Pldּl22O*ŕ^ʶ>*-`6x.溛ll6p%: +TH²mݴinx˴+]pقZ]7l4<%m>>~.VȵlYhPe\ d-IlM$*bCW$&VLt5ԈQ٥,=l0kq*ɑ7v&GW+1v%&C" `M:-lN=pz.IE:VùlTFPKI7s6O'WqSqj@-2Eө61vNvxJu^mVK(ԴL0N>*si,bs0eßI s2ȄƬ8xY 8jaV($=/>I0Sp $}:?I&Ĩ!K7Z/j1QOC'ↄpIzʢ )QO9 q(b*7&I";wZ-ǦFWZh9W`5RE@#Y[Y-R@Nu-f%COE&"rnB..kqb'ӥ0 |o|* |}U3y ~/ߗǟ7lmۏ=y𻽯)>iוvׇ6yVM.E'Au1o.fByC>߰ ditPX] )\n97:iwbO!fNȗc O)7ޝpRB hGKÜ%vaΉ8J.H%a8IJĴ hgߏH 8-kJT9q*YM& ڼB<+V:ݩ^dW_tcڌt籢W:|'<#bhfr fr'(+QshYꤰ/a-!u3$Uu5Ԡl寴ZEmRu lY6Tx-OJz/,˖ʹH4/Kq[7S52W oJU g1Dj. &PsW7a\`\1ˋ$Fb$_ ̈́enR=v6Xn7aq=_cY\$X&Sz⚼o@7 Q ldK5[S,Q d,&sn`FWDaؼ>Zu^lQY,=<'*Q"G|[qy, #skl2:CW[X2]xB-N8͡QhjiLatFN(.t,+ŵy:͘EQTZkX"vPlJ' lJwK-͕Ni\Dl#4-ꎠp&B (C< Lz6cVz50pQV|`NC^_ҹj =? Uf{zZV{,]\K}0žk5_q`015-,P5- oP(7Re͟WZ3a`͔.⾅`.MHdW尃hdWEr\~21h΅\3F2hnn%;&K ]< m[)%+lBEMYi]c$6X D- M߄OIzN8yq )H)aRG u N`2V'GlMQ "NYNJ& `HgQ( 2!m03/\ǫdMuڑ3 aFsr/C:݂s{FTt落UraΉRL<)_)- aH_:(ԀpȔ&l`C/t nx'[f~yŷ좯'_7TC a"tFod~'Fk^h/?}r?]z]*ϗ=r詤[ϗ7e~[^޿/?u]?w]woՃl?_WN_<fkZY6[뢪xYtt[(|~ -(it(fn{EGUx)Qcn~{Ҙ#UFοq]PH}K# JCmAmɒD]/f^Ȃ\mZڞ;}GQ=%DэABŁkK /4H8\đ#asAbxƏ'\{;]E@LAN1 ]:Z|ϧݏk՘\{EJ usJ1)wZ%r"^^` [ɹ5c*V r _^ -Yό>! 'N@;5zBqFnыNmچ<1Y.zn$omWmkj[Uo65 BڬF i ;V6aHGd !+Ϋgaxߩ2Je!D%+ Y7 %Jp p#FABM5Mf}+OJDxrhI߄(۠ 'uӰp}°PyNJ*]K*j6ȅ5ҕo]xoPwG(T[ؗWmߧ[B HͶn)Z&vsEe)F/O^Lh_nuWnן_^A-@-}??_]q{wiT,V/^v磻A}'}k>ۼ6Ij/_Ie[Q"ugMMiMhSv*>)o"QRPގK_mR ێLtRH8ƙ‹"`#!KA.~?{S:\R'Wo a $P\ڌpI&}u:9VCK_TG֏> Dn\a`p)PgsTyXI?_nMBY =[$N0BU>~ialrC:%=}V@6:.wߥ tDl]\ \YM̐˭*<.,s7S4AG2!ź_? 
zӝ"J;=;y<]:k4BQbG˧Sml|N̨9-ZN)16'z }{(@N IO莝~\r+FshR_Hҙ$OLZdR<7]cy=d1{\&!%R8d˩XZ*e4tLӆ?Z{C&kUK 3X/7Y`!w8ttxL>E ,F nT ݐofk\/|ނՉ)ʺ( J!ۦTѯJ%LEU6 K.DPiW`&W6e_狻?]ptUHz(^+<]z|򟮎=xu޵u>viv4aq4x w*rl{o{?Y|F#lMU+W-Q)pQ-hYZY1;<CZ*`) ̛`" #LqRP'r;G3͂7hj0G?'6%ߺPnB虡kw6]ѯRdĻft9P:VSA[T ί*F0̍;W-ж!Qm;hvTIhəbȥ aʇ()em6%nʪ0Յv6UFa(6D*R%U\TPhu$C)Gɪ$-Ђs (VP[\)4jŵ)@8R3T k+cx{<讍oJ{pf2V(l+yע%*(Mkڏnknn÷—n|?^MmWm6|fF6 AqEd(Wret v_Ȫmy~ZۛʮmfЗ \ܶu^=ͷ_=]y.W޻J[kҲUzCO4(+R*_RgKu*+tQВ0FJ'@OZl-;vOOZl-6e=~ʢYY$J[!ZQKUZX?ceG/'쾬m`-K:[v@t XC6*k.VTdGFIMNۇ1Y2)*k~9Ρ GC\Z$ 0ᑥ˭ȇo;Gr ]hcѯD^0A1zXt~M?60.N_Sqz{tΑQKsH;@UoH@Evs󈢑t ډɐR@;qppUBFWEd%PN: "Sl Rl(3bqHh=SczD9\OQOt299gII夨C g5d^AOfurzhXS:N]udPqVZQGE-èD@C9ayCvs(Οn7Al-Â܊2ؿҞvMfy"Fɬh>N߽CٳWVPAf4j+BIjq[%VUr[%V8Yt2ӧ4y\X%$S G&֘6W:Y$@$- '<^&5Q<RQNJD>D D>ݽkUpsBЅEQha]VV|{l/߱1O&jUA* LF2#۷Xzҫ6^K@7[5B5ߺua}37#kjR`t~`, `  }NPUZ-Q:)g]?!o  e)\ YZKt-8W邵1Gr5._VwԤA[ ,cH<{i%wyށVxa4X0)m6dQ5iynMZ+ ?VؼG4d0ft9b^ȦA9z[ҭڙYe/k ܟDԯnTJynQ*Her[\繥Sm|[Zxno8t5 bH`=M#IEeMޡex4L~OfjJ+YEeި<2KkLis5y9s`F'QҋdiFRWv>2u6SypEnHyJs IfbƷIH1PHO0a|Fi".#ׁJa0$nqK5\h=+]H!4t}aR PojMCsמWkm$HB ݭ: iob2dCF'fL>n&g6F&A곎]q2Eшi,z%r0Iѐ: Z{ޢ94$9#{4 Q[ThHTmnHC<&xӐOGWqѐ%6X/JC4PѰ:AoV'PbutRr|_i_(*b#PBd#j IS#,'t^()L鶤3/hŗQ= GeE]&9L ^tZ6ۊOi>AEk{a*a\=G3;l3P1Vj_x KH mfGS2(E5}6(X; .)`&(UR\)m5?5*+I0 UM9袘j^Y2ǡV^lmE|kz{ߖ#4;KS'%ҕ&eyrAH%P#V2nnnWX;'.ɯ :FzBPsHt 6tpB5jXH'tASH}vV`,%뮒2ӹ^IOx:QѪtOw ©OcP`by,4BKx`i#3.amGu6ZDwg1Bb ZAoRO]mS 6klUWigMAO.yهwz!ۘĜ eB+d ΉXP)[U hk`d]PXÛ<:d̓IX(uۉAzQ d+6h6y6 Zsÿ0I`.EnW+tgW-(%VF$]K"ep.κV#!-VE^H qu†JN<Նꄨe5qu\UoP\EѢ0\x Щq$^K@!O1x-!W'trN8/t?;.cZNQj8^]: :&|}oT'18\(J\sAtVΥd5\yl^k4UQXUWg;iGX AT)Hm7ܨ[e ӂ4}Sۺ.8Si`O e}X5ߚr-UiRPsg n Qoo;RƁr~SA|p*_x@CzA4{e)DZbHY`?6+D ݈fihBH@k|,wznzCﶜhdJhb4JɆ NeLi\JsVJ9u6+} GdO}r&r?՟Vڿ)[>a=6'aVH/n^K&zx'7yB!k;5x^ ™u+QϫU\Rlsc< v#T`bC $zJaobUmOS }MFheN7 񰱐P"qn%7J.ghv]MѯV]7t-Iw(EW9F'gus`8oNx=?"h8`XO H>BQFœ us1hl<2eכL~:zo@dEINl| $SSm`2eN$,HK%dx@ LJdʈkP2eNѹ{ MqivL_nqJCQGuQ;y#"b W>Z'_R n~owu!yQ[P j%vEQj8^T EҢ,3unzlWM_ 2~ F:mpk2ԉ\3jb.&T6T96ys[ r7GHVvh|>5/gu1A4%ցo8ᬕ'y DTJTׯ# 8bzZH)z ݉d9i^JaOH (!^yvN  
qzt4s 6heRas攀єcE!c/*^٫$Vtת0Sz')LVtIusf12/2-RG{Hъlڕ/Dv&wSTPx L`wͩ6 R?uSmxMZhmnqU~: U(K!D`p FlQ&6fi54a&D'7 )5B *׉P  AN*έGu0Y p )רGu\#!:&R2D[$a3FegPv, o'Ǜo0a7X$_Қ$FDgifH4O`B?%DЭQ'r`͓D$@(IyBfukM q CH#Cei⼶ff!9R׵KS,[[Q>꜕ܯ%;@t$x`Gzv/O8t!X^J8)uV y 97vZ$uZYˑ %%R)`d)9wiN EJ,KM:,UzX@p3;Ob)/ЪEs kV(讼1MdР"iV&oc0|SBL@F.\-ZδUEy*(lI-_'2 2TEeD-vN$,Y R 6*U u/On_/\j_AAz/_7Igu'÷WZ,tz[O?j=\}/nw;P8zގ,~~ys5?/6˧OWr3Ow>eˇ6|_޺D1lA~{vG%ZXɌëxO~RJj?6¯"F15G4@3QjJdkcd[VlmtcȄsVkg@p&`ܽ{Ѓp@9վt4F'vk4n+c6:|ǝ/3ײnw{ǣerK\fZ\iUIgtj9J U!rs}J%Qֶ6IVls̩w@ТȘ:]|ORA*.j_b2#Ki2iI6S>%Akx%Wk&VјV,Hrׂl$}wt5x871$J RCW) HQLtWJ6>+toT98/\s7(LV'*rl$$PG1p)f285*`zӸɮÃt&L19ټ2Y+ <̶Zk]{\'k.RK<'b/:9-2 Qt_^RKű[;Kz y^nm,TK¤89DC0!H#m*촗)&=0e٨ӆ g=TR%.䄨JPgJBt#;6й.#Jl32L).3 Z8lKdQF=lݨ7:єb Z:FY&Z"H&o3|0è= 9LYU[DuBW"&L*2n080#ēy1t2U&vu6,) -@2L yE;6r$\z/dn\7bЧ E=۞g,Lj%x cm{KhBXtX{Q+NV JxD ZSL v1ԭ^n.#D&t^%bjvx3jVi2Nf!3ӐiF%ѐƆIVMEӒ62+i49r@IEhevepeѐ& !zdW:'mb<p>DH-Dv\Ui8M. Oˊˣg"9*X %P1m>$s2tv5c.$'5ǰڿuڞgLmM;'*aaWJ<}Mv?ys;Li S+*CЙ}w_7ϏA?ݦl۟{}Gx|_}z{AtaێmW<>ϲw{oF6>{3D ҄|onܓH'K&[0S2 5/.]';!ycIr&kkQ !]8UhMF>z&ϕ4|NH dCWPjTE+C:ё>h1?'qlז*Yo"qVpP:K"i 2ai2*|7 83I(^h@gV~ޟ&^Mn 7+"Q5uu][dwn +U eM6|bYV{.!/I ّ+z?~{z$I JKkY+a(q+U0ؒdwnιMU(<} [ Gl!ۻjsa>o;z0m  [WjQPnzR .v_k1^ws'r {d:={[hV>̤=/vhϛ?Dv4t}'cީ;Ċ 3u<.f~Jb+}!҇a+=!ܟo8[SqCSZThEs!O{ީ;11nyN~tHm>|)Ǔ S*OKG,G˫вj0%L'""X5؈0UF=hYeT) =ՠ0Ҳקe o{ yjlvZhW4BGa "TmڃSWxj$M' 8se (H$Ը%uC-+C-I]Ʒ'LIRINs^8feEc6ץΞKh1k҉0YU2Y&QA vǫT<%HBG=F)XϻQ(FuqzAo_ʼ<*͏ ϩD 0/vN{U@[%Ӫ `G:_nR{{d.u8!T薄jLpRd >*K+}!҇Jt îƪﻺ]Mբ#_y}E}}!im׶j5dTMBMI&;9rh/i*)+SC5׹n|eIێh]xqߵ}kW@%oƒS[KWem{K6&.ѲHe7mJ: OVn*,YϽ8"${:\Sb;]lz ,+W=]eILjp5ޕ9}{R4Vt_x}_c*ZI+jbʴ]zFOsby҃m]+?;N7dE|{K%BE-M6 U|Q=i>S#%ЖE>--Ѯ\j\BD ucV_2å#V~GF_"PBrW>C| Tta~_Ɉ<__E3V/|͍gbEf鬨Bˊ6SI(|-bٝ@ 7BB윳1EXѲEb:qEm&;JZ!gXbЬiiC׎Q RYb %V_\??JDpپܜ26t:H7LQRÍSeeT v ̈́jXz8`BӋN֡ ˺[OO3oQ,vB-ou\7/9HI|n=: pVL\#a]NȾ`VFs%~B9I2;bze)_֙DLk& $#\@༱X G"Y3_ԱPǔ uiN< U$=\9̢HT߾Gxyڍ+^൉j7I1xevm/;0>N[Ư5+X2}b"sVxǡ/)m3(6H}Ә TmqN][mmXa+~ĒNkX} kk1bӡ\k dWյK:6밙=fPϳL`6S\hLhN9J4 a#3Ͻ|CKjHZ:5 )[:]3 
M`ݵzpNZ]hp){MS$ʊ9:;M+pb&f,06&p36DRuESqMmE+ΗC:.Ѭ ,WDz|qUI[SIgT*J6KUSmvXD-v0G'阣U<_0G"E=ҼsO1G\ܭ 3b^+l|H6 ӿdAؿfSiH v{(Oe2f Г<Z= -juF##-š5W|a~̑4PX9=8xSϹχ6fuV|S'T O%'N^8篯f& /3dHm&ʨd PU5 CP D*$+56HR`cQ mӓ"kV@´~5 a. EwHp<m)tºo1x]R 9c[&4X|jFn, z$.|F$%;we1KhY+qD$dȮK`[<3W$Xe;MTƑt~"(פ(+ |/v~RHeAM& kD`PaMm Ö>*Ү^B;(Ѭh$XxsrRښHz8? Ivkt!ɚ: )jӐ$j1+4~LC:<]Ř$PЪ\l)גOq-Hi.II4|-ACƤI6w K)v߼Yq~|($y,GHz#=KEͅG#ðhE&j|hs9כ&2҆߯1Rp^y&؄ξ>`w*ϼI)FNa&;NËg<~U2%2byY.$ܐک2ꁃU&3ʨ ~3ѯKبawM( ,Vq*H^w>-[gNurN>D 2P){Z.ƂINn]8r+iN\:<]7;=oWIb]7't$F.rR#IB y_@] Eq{3k NӺ{4X/ Rhz2 JxM`n{jH1G>^h}NuPǵS=Pw~eέޱ+4u [1GHuaT֬0d/ К7boncV0l7Ӷ-V]~;Q #VuoL5>!3[C6Nۺm7?i;w?眴/6m~&nq q_Zh=yz >O7?i>I;VŤKLJJ`9[a|ZkGU0_=h:ePXmM Vk)Ebɰ\oƒS[%AYԎ̳ЋqQujH/*j h> ׻frEgf KťEf"KhE5A/SCOirbɆ1ZaVј\EZڒp0c% Ѹsfe,56\]|`q9(mMu$=^$RiE2oN/ʧ"ZNٌdz*/L"LTSQ!' Pq$Ɯ>\,O82.ٽF2 5+Lh$G0;IYLkXG 5| 5_ 42-X/ym1#Xh =>ژ2GsZ㧧C:lcO>O'獧T):ٗxș3sI_LgJR9§IK 9[H&D !GecxdQUfPնS̈́j` VNf"Y7IQFۢuE׭:m[C,k{ 2T){b ͐]B$шSץ) "KZ]`o$Ć%$9sw]T_~J79ryoiJtAƸRм;t6ݛMPɩB!7ﺼrR* kxo5b].aܠkPL]꘩*Ui_aheԍ/7Ѓ!GuHuЬ<&VwJ^88lbA8%-#m&Alb;ZV t!uzk=M!U C70CbI|3T]WzGT?{gFb䲇*,i6sZ YAЯ8LfgaKrҖ6)08ju'bGCĠKs>Jvi!7/;t9Hoʻ]va[ōi~ۛM XϟVw[|l>{W/(/w\_\nw/S\m~^n~߹.a|Bnxs xl(އeSdшb@{~;iLjclH'!u=ر8 )Ov|ߣ!M=D)ATtM(ZIR"+M&D,0P?Ƙ\,2!4eiySGYڕ]>(amYթ]Cג34-$! {YӐT;@iH1*>vvR\!SmzRZ  i?LC^\Ř( X(IX DXSHjU:VȆllN !%_Ґu[/IO88Z4#4Q9.s5[6f&\Cq{[tZwpcʩ Kc!w8vXx`i-EGnNg5[]OgVSg5eEԳڂtlFŪ2^KyYrջKdz|:zxL¬zFCFфw_ˣ Ñ<+s}G?qtgS^jȼ`LL} O=hOUTmQ c @*GY|b q|ӽ"k3TDY0_½GUUŊ4U¢~N#HbX>I#{3g5`+ cHF1NKg2!d4l߉ 7 A C2K 6 :c!xQ4 䂼8gwSs)*]_^ wou?DdKfJpy[3'\^[U?mnyZ0_֘,>Iy >_j:F3Zj ֋Ë,d2E~LiArNʺZmSZ!7 mU[dKԶ-]i'(Kv,Yрet)궲 UJWLT֭ *jk]R-m]qM5ƒTMmrSmQABrޔw W ?¶7w{?^ 6|ӻ^(Pfz.NE|K\m~^n~߹.a|Bnxs xl^ o_}w Ii&eؼTW? g;#Ʒ2EԭhHcǢ48w i=Omb9^H%-xM(t"iD0l?ca dh#K3,Y%n`D&wrx]'D0i2m3T;fBS*:b18s*j3\:o0sj{urc*TH.Q49NSQ:YD`H0P:f"JkZTXNl ?E2=laMrP|hV K j'C_41: AC0rM֋}iYk&!Óe3 uEN\Ĉ;ʣq ʣJ ˥n͈jhyb M(8($3vOGJ(. 
192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.788018 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38334->192.168.126.11:10357: read: connection reset by peer"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.788102 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.788317 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.789941 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.790004 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.790021 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.790743 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ee06813ea29c5d40ce45eb35e4c86781711a1d33441dc3d2963cea3b483c5823"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 09:20:29 crc kubenswrapper[4971]: I0309 09:20:29.790967 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ee06813ea29c5d40ce45eb35e4c86781711a1d33441dc3d2963cea3b483c5823" gracePeriod=30
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.078623 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:30Z is after 2026-02-23T05:33:13Z
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.306370 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.306753 4971 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ee06813ea29c5d40ce45eb35e4c86781711a1d33441dc3d2963cea3b483c5823" exitCode=255
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.306832 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ee06813ea29c5d40ce45eb35e4c86781711a1d33441dc3d2963cea3b483c5823"}
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.306865 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f45e548e9b8fb830b43d23c0f2d5e57fc21975ade10bc6ba044d998b75e92bfe"}
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.306985 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.307780 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.307844 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.307855 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.309285 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.309718 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.311612 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b" exitCode=255
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.311665 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b"}
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.311713 4971 scope.go:117] "RemoveContainer" containerID="e0cbca7ab2ac4006b68b497c60375cf797895f1540a3d124c207a7518ee238fe"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.311917 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.313053 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.313082 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.313094 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:20:30 crc kubenswrapper[4971]: I0309 09:20:30.313680 4971 scope.go:117] "RemoveContainer" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b"
Mar 09 09:20:30 crc kubenswrapper[4971]: E0309 09:20:30.313855 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:20:30 crc kubenswrapper[4971]: E0309 09:20:30.905181 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b21c26331c394 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC m=+0.631927712,LastTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC m=+0.631927712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.078134 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:31Z is after 2026-02-23T05:33:13Z
Mar 09 09:20:31 crc kubenswrapper[4971]: E0309 09:20:31.303551 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:31Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.312750 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.314937 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.314989 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.315000 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.315025 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:20:31 crc kubenswrapper[4971]: E0309 09:20:31.318021 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:31Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 09:20:31 crc kubenswrapper[4971]: I0309 09:20:31.318195 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.077900 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:32Z is after 2026-02-23T05:33:13Z
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.543126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.543299 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.544690 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.544733 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:20:32 crc kubenswrapper[4971]: I0309 09:20:32.544744 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:20:33 crc kubenswrapper[4971]: I0309 09:20:33.080636 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:33Z is after 2026-02-23T05:33:13Z
Mar 09 09:20:33 crc kubenswrapper[4971]: W0309 09:20:33.485541 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:33Z is after 2026-02-23T05:33:13Z
Mar 09 09:20:33 crc kubenswrapper[4971]: E0309 09:20:33.485620 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:20:34 crc kubenswrapper[4971]: I0309 09:20:34.078274 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:34Z is after 2026-02-23T05:33:13Z Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.079703 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:35Z is after 2026-02-23T05:33:13Z Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.743784 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:20:35 crc kubenswrapper[4971]: E0309 09:20:35.749718 4971 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:20:35 crc kubenswrapper[4971]: E0309 09:20:35.750988 4971 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.792012 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.792305 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:35 
crc kubenswrapper[4971]: I0309 09:20:35.794078 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.794125 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.794142 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:35 crc kubenswrapper[4971]: I0309 09:20:35.794981 4971 scope.go:117] "RemoveContainer" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b" Mar 09 09:20:35 crc kubenswrapper[4971]: E0309 09:20:35.795298 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:20:36 crc kubenswrapper[4971]: I0309 09:20:36.079680 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:36Z is after 2026-02-23T05:33:13Z Mar 09 09:20:36 crc kubenswrapper[4971]: W0309 09:20:36.889709 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:36Z is after 2026-02-23T05:33:13Z Mar 09 09:20:36 crc kubenswrapper[4971]: 
E0309 09:20:36.889783 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:20:37 crc kubenswrapper[4971]: I0309 09:20:37.078007 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:37Z is after 2026-02-23T05:33:13Z Mar 09 09:20:37 crc kubenswrapper[4971]: E0309 09:20:37.225599 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.077343 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:38Z is after 2026-02-23T05:33:13Z Mar 09 09:20:38 crc kubenswrapper[4971]: E0309 09:20:38.309468 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:38Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.318550 4971 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.320337 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.320471 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.320517 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:38 crc kubenswrapper[4971]: I0309 09:20:38.320562 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:20:38 crc kubenswrapper[4971]: E0309 09:20:38.325731 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.062751 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.063803 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.065700 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.065778 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.065798 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.066836 4971 scope.go:117] 
"RemoveContainer" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b" Mar 09 09:20:39 crc kubenswrapper[4971]: E0309 09:20:39.067242 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.077399 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:39Z is after 2026-02-23T05:33:13Z Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.656926 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.657178 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.658570 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.658618 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:39 crc kubenswrapper[4971]: I0309 09:20:39.658635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:40 crc kubenswrapper[4971]: I0309 09:20:40.079328 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:40Z is after 2026-02-23T05:33:13Z Mar 09 09:20:40 crc kubenswrapper[4971]: W0309 09:20:40.406343 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:40Z is after 2026-02-23T05:33:13Z Mar 09 09:20:40 crc kubenswrapper[4971]: E0309 09:20:40.406495 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:20:40 crc kubenswrapper[4971]: E0309 09:20:40.910007 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b21c26331c394 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC m=+0.631927712,LastTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC 
m=+0.631927712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:41 crc kubenswrapper[4971]: I0309 09:20:41.079639 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:41Z is after 2026-02-23T05:33:13Z Mar 09 09:20:42 crc kubenswrapper[4971]: I0309 09:20:42.077087 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:42Z is after 2026-02-23T05:33:13Z Mar 09 09:20:42 crc kubenswrapper[4971]: I0309 09:20:42.658087 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:20:42 crc kubenswrapper[4971]: I0309 09:20:42.658745 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:20:43 crc kubenswrapper[4971]: I0309 09:20:43.078948 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:43Z is after 2026-02-23T05:33:13Z Mar 09 09:20:44 crc kubenswrapper[4971]: I0309 09:20:44.078243 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:44Z is after 2026-02-23T05:33:13Z Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.077568 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:45Z is after 2026-02-23T05:33:13Z Mar 09 09:20:45 crc kubenswrapper[4971]: E0309 09:20:45.313421 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:45Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.326471 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.328053 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.328105 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:45 crc 
kubenswrapper[4971]: I0309 09:20:45.328122 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.328161 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:20:45 crc kubenswrapper[4971]: E0309 09:20:45.331422 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:45Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.366026 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.366289 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.367546 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.367588 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:45 crc kubenswrapper[4971]: I0309 09:20:45.367596 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:46 crc kubenswrapper[4971]: I0309 09:20:46.078628 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:46Z is after 2026-02-23T05:33:13Z Mar 09 09:20:47 crc kubenswrapper[4971]: I0309 09:20:47.077269 4971 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:47Z is after 2026-02-23T05:33:13Z Mar 09 09:20:47 crc kubenswrapper[4971]: E0309 09:20:47.226341 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:20:48 crc kubenswrapper[4971]: I0309 09:20:48.078697 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:20:48Z is after 2026-02-23T05:33:13Z Mar 09 09:20:48 crc kubenswrapper[4971]: W0309 09:20:48.524445 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 09:20:48 crc kubenswrapper[4971]: E0309 09:20:48.524583 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 09:20:49 crc kubenswrapper[4971]: I0309 09:20:49.081077 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:50 crc kubenswrapper[4971]: I0309 09:20:50.079527 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.918734 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26331c394 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC m=+0.631927712,LastTimestamp:2026-03-09 09:19:57.071999892 +0000 UTC m=+0.631927712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.926159 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.931420 4971 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.938047 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.945457 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26bcba71f 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.216302879 +0000 UTC m=+0.776230699,LastTimestamp:2026-03-09 09:19:57.216302879 +0000 UTC m=+0.776230699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.950689 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.252522968 +0000 UTC m=+0.812450808,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.958259 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.252559979 +0000 UTC m=+0.812487819,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.964963 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.252576729 +0000 UTC m=+0.812504569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.972608 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.254136043 +0000 UTC m=+0.814063863,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.980765 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.254162473 +0000 UTC m=+0.814090293,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.988671 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC 
m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.254174594 +0000 UTC m=+0.814102424,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:50 crc kubenswrapper[4971]: E0309 09:20:50.997204 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.256479358 +0000 UTC m=+0.816407168,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.004663 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.256500458 +0000 UTC m=+0.816428268,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.011590 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.256511919 +0000 UTC m=+0.816439739,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.017985 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.256613242 +0000 UTC m=+0.816541062,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.023813 4971 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.256648363 +0000 UTC m=+0.816576183,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.029896 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.256658903 +0000 UTC m=+0.816586723,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.034882 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.256790056 +0000 UTC m=+0.816717887,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.039876 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.256819617 +0000 UTC m=+0.816747437,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.047131 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.257296751 +0000 UTC m=+0.817224601,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.054040 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.257925658 +0000 UTC m=+0.817853468,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.059886 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.257949209 +0000 UTC m=+0.817877009,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.067755 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788f6ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788f6ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.14482353 +0000 UTC m=+0.704751360,LastTimestamp:2026-03-09 09:19:57.257961869 +0000 UTC m=+0.817889679,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.074600 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c267885eae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c267885eae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144784558 +0000 UTC 
m=+0.704712378,LastTimestamp:2026-03-09 09:19:57.258793442 +0000 UTC m=+0.818721262,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: I0309 09:20:51.075042 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.084917 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21c26788bf53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21c26788bf53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.144809299 +0000 UTC m=+0.704737119,LastTimestamp:2026-03-09 09:19:57.258815533 +0000 UTC m=+0.818743353,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.093599 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c285a19f79 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.649756025 +0000 UTC m=+1.209683845,LastTimestamp:2026-03-09 09:19:57.649756025 +0000 UTC m=+1.209683845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.097927 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c285c6ab73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.652183923 +0000 UTC m=+1.212111743,LastTimestamp:2026-03-09 09:19:57.652183923 +0000 UTC m=+1.212111743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.101572 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2860e4bc8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.656878024 +0000 UTC m=+1.216805884,LastTimestamp:2026-03-09 09:19:57.656878024 +0000 UTC m=+1.216805884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.108257 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2872fbdf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.675847152 +0000 UTC m=+1.235774972,LastTimestamp:2026-03-09 
09:19:57.675847152 +0000 UTC m=+1.235774972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.115725 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c287cd6ea4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:57.68618154 +0000 UTC m=+1.246109360,LastTimestamp:2026-03-09 09:19:57.68618154 +0000 UTC m=+1.246109360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.122715 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2adcdde84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.323744388 +0000 UTC m=+1.883672208,LastTimestamp:2026-03-09 09:19:58.323744388 +0000 UTC m=+1.883672208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.129940 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2adce61a7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.323777959 +0000 UTC m=+1.883705769,LastTimestamp:2026-03-09 09:19:58.323777959 +0000 UTC m=+1.883705769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.136401 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c2add06978 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.323911032 +0000 UTC m=+1.883838882,LastTimestamp:2026-03-09 09:19:58.323911032 +0000 UTC m=+1.883838882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.141876 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c2add0695a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.323911002 +0000 UTC m=+1.883838852,LastTimestamp:2026-03-09 09:19:58.323911002 +0000 UTC m=+1.883838852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.149117 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2add9e4a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.32453239 +0000 UTC m=+1.884460200,LastTimestamp:2026-03-09 09:19:58.32453239 +0000 UTC m=+1.884460200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.154067 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2ae62544c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.333473868 +0000 UTC m=+1.893401678,LastTimestamp:2026-03-09 09:19:58.333473868 +0000 UTC m=+1.893401678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.158803 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2aeca2d9c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.340279708 +0000 UTC m=+1.900207518,LastTimestamp:2026-03-09 09:19:58.340279708 +0000 UTC m=+1.900207518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.165689 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2aedfe22b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.341702187 +0000 UTC m=+1.901630037,LastTimestamp:2026-03-09 09:19:58.341702187 +0000 UTC m=+1.901630037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.170584 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c2af1f556f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.345860463 +0000 UTC m=+1.905788273,LastTimestamp:2026-03-09 09:19:58.345860463 +0000 UTC m=+1.905788273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.176559 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c2af212244 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.345978436 +0000 UTC m=+1.905906286,LastTimestamp:2026-03-09 09:19:58.345978436 +0000 UTC m=+1.905906286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.182218 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2af21fc2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.346034218 +0000 UTC m=+1.905962068,LastTimestamp:2026-03-09 09:19:58.346034218 +0000 UTC m=+1.905962068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.189742 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2c507f97a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.713428346 +0000 UTC m=+2.273356186,LastTimestamp:2026-03-09 09:19:58.713428346 
+0000 UTC m=+2.273356186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.196636 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2c5d435dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.726813148 +0000 UTC m=+2.286740968,LastTimestamp:2026-03-09 09:19:58.726813148 +0000 UTC m=+2.286740968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.203156 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2c5f033ff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.728647679 +0000 UTC m=+2.288575499,LastTimestamp:2026-03-09 09:19:58.728647679 +0000 UTC m=+2.288575499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.209244 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2d1f8a3fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.930527228 +0000 UTC m=+2.490455038,LastTimestamp:2026-03-09 09:19:58.930527228 +0000 UTC m=+2.490455038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.214783 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2d350fb97 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.953094039 +0000 UTC m=+2.513021849,LastTimestamp:2026-03-09 09:19:58.953094039 +0000 UTC m=+2.513021849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.221398 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2d36299ac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.95424862 +0000 UTC m=+2.514176440,LastTimestamp:2026-03-09 09:19:58.95424862 +0000 UTC m=+2.514176440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 
09:20:51.228453 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2e00f4e43 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.166893635 +0000 UTC m=+2.726821445,LastTimestamp:2026-03-09 09:19:59.166893635 +0000 UTC m=+2.726821445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.235078 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2e0e109bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.180638655 +0000 UTC 
m=+2.740566465,LastTimestamp:2026-03-09 09:19:59.180638655 +0000 UTC m=+2.740566465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.241511 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c2e0e342b3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.180784307 +0000 UTC m=+2.740712137,LastTimestamp:2026-03-09 09:19:59.180784307 +0000 UTC m=+2.740712137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.244621 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2e0f9a2da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.182250714 +0000 UTC m=+2.742178534,LastTimestamp:2026-03-09 09:19:59.182250714 +0000 UTC m=+2.742178534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.248981 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c2e1059fbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.183036349 +0000 UTC m=+2.742964159,LastTimestamp:2026-03-09 09:19:59.183036349 +0000 UTC m=+2.742964159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.251192 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2e148cab3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.187438259 +0000 UTC m=+2.747366069,LastTimestamp:2026-03-09 09:19:59.187438259 +0000 UTC m=+2.747366069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.254120 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2ec262cbc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.369718972 +0000 UTC m=+2.929646782,LastTimestamp:2026-03-09 09:19:59.369718972 +0000 UTC m=+2.929646782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 
09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.256122 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c2ec2c7817 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.370131479 +0000 UTC m=+2.930059289,LastTimestamp:2026-03-09 09:19:59.370131479 +0000 UTC m=+2.930059289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.261438 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2ec388c60 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.370923104 +0000 UTC m=+2.930850914,LastTimestamp:2026-03-09 09:19:59.370923104 +0000 UTC m=+2.930850914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.262917 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c2ec50b25d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.372505693 +0000 UTC m=+2.932433503,LastTimestamp:2026-03-09 09:19:59.372505693 +0000 UTC m=+2.932433503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.269632 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2ed09a80d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.384627213 +0000 UTC m=+2.944555023,LastTimestamp:2026-03-09 09:19:59.384627213 +0000 UTC m=+2.944555023,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.275919 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2ed1b0fea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.385767914 +0000 UTC m=+2.945695724,LastTimestamp:2026-03-09 09:19:59.385767914 +0000 UTC m=+2.945695724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.282140 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21c2ed8440a5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.392661669 +0000 UTC m=+2.952589479,LastTimestamp:2026-03-09 09:19:59.392661669 +0000 UTC m=+2.952589479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.289739 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2ed958ce4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.3937953 +0000 UTC m=+2.953723120,LastTimestamp:2026-03-09 09:19:59.3937953 +0000 UTC m=+2.953723120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.296156 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2eda3d8e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.394732257 +0000 UTC m=+2.954660067,LastTimestamp:2026-03-09 09:19:59.394732257 +0000 UTC m=+2.954660067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.302231 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c2edf661ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.400141295 +0000 UTC m=+2.960069105,LastTimestamp:2026-03-09 09:19:59.400141295 +0000 UTC m=+2.960069105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.308749 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2fafe756b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.618774379 +0000 UTC m=+3.178702199,LastTimestamp:2026-03-09 09:19:59.618774379 +0000 UTC m=+3.178702199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.314994 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2faff26ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.61881979 +0000 UTC m=+3.178747590,LastTimestamp:2026-03-09 09:19:59.61881979 +0000 UTC m=+3.178747590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.321681 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2fc349aae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.639100078 +0000 UTC m=+3.199027898,LastTimestamp:2026-03-09 09:19:59.639100078 +0000 UTC m=+3.199027898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.328537 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c2fc44ddce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.640165838 +0000 UTC m=+3.200093658,LastTimestamp:2026-03-09 09:19:59.640165838 +0000 UTC m=+3.200093658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.337136 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2fc6e445b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.642879067 +0000 UTC m=+3.202806877,LastTimestamp:2026-03-09 09:19:59.642879067 +0000 UTC m=+3.202806877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.343616 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c2fca7e8ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.646656746 +0000 UTC m=+3.206584556,LastTimestamp:2026-03-09 09:19:59.646656746 +0000 UTC m=+3.206584556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.350665 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c307c872bf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.833338559 +0000 UTC m=+3.393266369,LastTimestamp:2026-03-09 09:19:59.833338559 +0000 UTC m=+3.393266369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.358233 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c307efa74c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.835907916 +0000 UTC m=+3.395835726,LastTimestamp:2026-03-09 09:19:59.835907916 +0000 UTC m=+3.395835726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.364886 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c3089f1401 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.847404545 +0000 UTC m=+3.407332355,LastTimestamp:2026-03-09 09:19:59.847404545 +0000 UTC m=+3.407332355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.371521 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c308bf43e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.849513953 +0000 UTC m=+3.409441773,LastTimestamp:2026-03-09 09:19:59.849513953 +0000 UTC m=+3.409441773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.376641 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21c308da583b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.851288635 +0000 UTC m=+3.411216445,LastTimestamp:2026-03-09 09:19:59.851288635 +0000 UTC m=+3.411216445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.381600 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c311969b10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:59.99784424 +0000 UTC m=+3.557772060,LastTimestamp:2026-03-09 09:19:59.99784424 +0000 UTC m=+3.557772060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.387681 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c31240d92e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.009001262 +0000 UTC m=+3.568929072,LastTimestamp:2026-03-09 
09:20:00.009001262 +0000 UTC m=+3.568929072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.394924 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c3124fd26a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.00998257 +0000 UTC m=+3.569910380,LastTimestamp:2026-03-09 09:20:00.00998257 +0000 UTC m=+3.569910380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.401749 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c31b98b8e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.165755112 +0000 UTC m=+3.725682922,LastTimestamp:2026-03-09 09:20:00.165755112 +0000 UTC m=+3.725682922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.408582 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c31c81c19d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.181027229 +0000 UTC m=+3.740955059,LastTimestamp:2026-03-09 09:20:00.181027229 +0000 UTC m=+3.740955059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.416325 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c31da716ef 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.200251119 +0000 UTC m=+3.760178929,LastTimestamp:2026-03-09 09:20:00.200251119 +0000 UTC m=+3.760178929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.423472 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3289dc81c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.384190492 +0000 UTC m=+3.944118302,LastTimestamp:2026-03-09 09:20:00.384190492 +0000 UTC m=+3.944118302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.430758 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c329797c24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.398588964 +0000 UTC m=+3.958516774,LastTimestamp:2026-03-09 09:20:00.398588964 +0000 UTC m=+3.958516774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.438213 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c35a08ade1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.213279713 +0000 UTC m=+4.773207523,LastTimestamp:2026-03-09 09:20:01.213279713 +0000 UTC m=+4.773207523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.445016 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c366dba52d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.428432173 +0000 UTC m=+4.988359983,LastTimestamp:2026-03-09 09:20:01.428432173 +0000 UTC m=+4.988359983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.450669 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3677e4ddb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.439092187 +0000 UTC m=+4.999020027,LastTimestamp:2026-03-09 09:20:01.439092187 +0000 UTC m=+4.999020027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.454535 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3678f1481 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.440191617 +0000 UTC m=+5.000119427,LastTimestamp:2026-03-09 09:20:01.440191617 +0000 UTC m=+5.000119427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.456717 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c37312f094 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.633382548 +0000 UTC m=+5.193310358,LastTimestamp:2026-03-09 09:20:01.633382548 +0000 UTC m=+5.193310358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.460823 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c37409dc83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.649564803 +0000 UTC m=+5.209492613,LastTimestamp:2026-03-09 09:20:01.649564803 +0000 UTC m=+5.209492613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.463543 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3741ea8ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.650927787 +0000 UTC m=+5.210855597,LastTimestamp:2026-03-09 09:20:01.650927787 +0000 UTC m=+5.210855597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.467034 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3807511ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.85791739 +0000 UTC m=+5.417845200,LastTimestamp:2026-03-09 09:20:01.85791739 +0000 UTC m=+5.417845200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.470464 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c3817af0ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.875079342 +0000 UTC m=+5.435007162,LastTimestamp:2026-03-09 09:20:01.875079342 +0000 UTC m=+5.435007162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.473820 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c38189bb8c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:01.87604878 +0000 UTC m=+5.435976600,LastTimestamp:2026-03-09 09:20:01.87604878 +0000 UTC m=+5.435976600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.477170 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c38d327401 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:02.071655425 +0000 UTC m=+5.631583245,LastTimestamp:2026-03-09 09:20:02.071655425 +0000 UTC m=+5.631583245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.480924 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c38e2420b5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:02.087493813 +0000 UTC m=+5.647421623,LastTimestamp:2026-03-09 09:20:02.087493813 +0000 UTC m=+5.647421623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.484272 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c38e375d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:02.088754486 +0000 UTC m=+5.648682296,LastTimestamp:2026-03-09 09:20:02.088754486 +0000 UTC m=+5.648682296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.488049 4971 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c398d07b70 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:02.266561392 +0000 UTC m=+5.826489202,LastTimestamp:2026-03-09 09:20:02.266561392 +0000 UTC m=+5.826489202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.490924 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21c399a47ed9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:02.280455897 +0000 UTC m=+5.840383717,LastTimestamp:2026-03-09 09:20:02.280455897 +0000 UTC m=+5.840383717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.495161 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:20:51 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-apiserver-crc.189b21c59b98ff2d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:20:51 crc kubenswrapper[4971]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 09:20:51 crc kubenswrapper[4971]: Mar 09 09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:10.903191341 +0000 UTC m=+14.463119191,LastTimestamp:2026-03-09 09:20:10.903191341 +0000 UTC m=+14.463119191,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.499899 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c59b99f532 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:10.903254322 +0000 UTC m=+14.463182172,LastTimestamp:2026-03-09 09:20:10.903254322 +0000 UTC m=+14.463182172,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.505625 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21c59b98ff2d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:20:51 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-apiserver-crc.189b21c59b98ff2d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:20:51 crc kubenswrapper[4971]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 09:20:51 crc kubenswrapper[4971]: Mar 09 09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:10.903191341 +0000 UTC m=+14.463119191,LastTimestamp:2026-03-09 09:20:10.919277848 +0000 UTC m=+14.479205678,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.511159 4971 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189b21c59b99f532\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c59b99f532 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:10.903254322 +0000 UTC m=+14.463182172,LastTimestamp:2026-03-09 09:20:10.91933052 +0000 UTC m=+14.479258350,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.520094 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21c3124fd26a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c3124fd26a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.00998257 +0000 UTC m=+3.569910380,LastTimestamp:2026-03-09 09:20:11.251582954 +0000 
UTC m=+14.811510774,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.525233 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21c31b98b8e8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c31b98b8e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.165755112 +0000 UTC m=+3.725682922,LastTimestamp:2026-03-09 09:20:11.514276971 +0000 UTC m=+15.074204821,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.530732 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21c31c81c19d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21c31c81c19d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:00.181027229 +0000 UTC m=+3.740955059,LastTimestamp:2026-03-09 09:20:11.525111489 +0000 UTC m=+15.085039339,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.536608 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:20:51 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21c6043692f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:20:51 crc kubenswrapper[4971]: body: Mar 09 09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658348791 +0000 UTC m=+16.218276621,LastTimestamp:2026-03-09 09:20:12.658348791 +0000 UTC m=+16.218276621,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.541620 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c604385324 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658463524 +0000 UTC m=+16.218391334,LastTimestamp:2026-03-09 09:20:12.658463524 +0000 UTC m=+16.218391334,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.549303 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c6043692f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:20:51 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21c6043692f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:20:51 crc kubenswrapper[4971]: body: Mar 09 
09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658348791 +0000 UTC m=+16.218276621,LastTimestamp:2026-03-09 09:20:22.656395847 +0000 UTC m=+26.216323697,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.554459 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c604385324\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c604385324 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658463524 +0000 UTC m=+16.218391334,LastTimestamp:2026-03-09 09:20:22.656452738 +0000 UTC m=+26.216380578,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.557008 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:20:51 crc 
kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21ca01382e82 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:38334->192.168.126.11:10357: read: connection reset by peer Mar 09 09:20:51 crc kubenswrapper[4971]: body: Mar 09 09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:29.787991682 +0000 UTC m=+33.347919532,LastTimestamp:2026-03-09 09:20:29.787991682 +0000 UTC m=+33.347919532,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.561321 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21ca01393d92 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38334->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
09:20:29.788061074 +0000 UTC m=+33.347988924,LastTimestamp:2026-03-09 09:20:29.788061074 +0000 UTC m=+33.347988924,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.567002 4971 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21ca01653354 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:29.790942036 +0000 UTC m=+33.350869866,LastTimestamp:2026-03-09 09:20:29.790942036 +0000 UTC m=+33.350869866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.572825 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c2aedfe22b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2aedfe22b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.341702187 +0000 UTC m=+1.901630037,LastTimestamp:2026-03-09 09:20:29.8075912 +0000 UTC m=+33.367519020,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.578341 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c2c507f97a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2c507f97a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.713428346 +0000 UTC m=+2.273356186,LastTimestamp:2026-03-09 09:20:29.980873504 +0000 UTC m=+33.540801314,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.583064 4971 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189b21c2c5d435dc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c2c5d435dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:19:58.726813148 +0000 UTC m=+2.286740968,LastTimestamp:2026-03-09 09:20:29.989389836 +0000 UTC m=+33.549317646,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.588431 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c6043692f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:20:51 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21c6043692f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
09 09:20:51 crc kubenswrapper[4971]: body: Mar 09 09:20:51 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658348791 +0000 UTC m=+16.218276621,LastTimestamp:2026-03-09 09:20:42.658687285 +0000 UTC m=+46.218615135,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:51 crc kubenswrapper[4971]: > Mar 09 09:20:51 crc kubenswrapper[4971]: E0309 09:20:51.592560 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c604385324\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21c604385324 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658463524 +0000 UTC m=+16.218391334,LastTimestamp:2026-03-09 09:20:42.658970593 +0000 UTC m=+46.218898443,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.082038 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 09 09:20:52 crc kubenswrapper[4971]: E0309 09:20:52.317932 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.332235 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.333505 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.333545 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.333557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.333582 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:20:52 crc kubenswrapper[4971]: E0309 09:20:52.338320 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.657618 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:20:52 crc kubenswrapper[4971]: I0309 09:20:52.657718 4971 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:20:52 crc kubenswrapper[4971]: E0309 09:20:52.663775 4971 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21c6043692f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:20:52 crc kubenswrapper[4971]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21c6043692f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:20:52 crc kubenswrapper[4971]: body: Mar 09 09:20:52 crc kubenswrapper[4971]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:20:12.658348791 +0000 UTC m=+16.218276621,LastTimestamp:2026-03-09 09:20:52.657695879 +0000 UTC m=+56.217623719,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:20:52 crc kubenswrapper[4971]: > Mar 09 09:20:53 crc kubenswrapper[4971]: I0309 09:20:53.081080 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.080805 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.151400 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.153151 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.153212 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.153231 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.154182 4971 scope.go:117] "RemoveContainer" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b" Mar 09 09:20:54 crc kubenswrapper[4971]: I0309 09:20:54.384964 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.078577 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.392082 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.393202 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.395658 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" exitCode=255 Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.395709 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff"} Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.395748 4971 scope.go:117] "RemoveContainer" containerID="0651a7226ac7342850ed5f0faf8ec5493bb6178f9c14a076ea6efb23a1972b5b" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.395924 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.397387 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.397411 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.397420 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.397866 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:20:55 
crc kubenswrapper[4971]: E0309 09:20:55.398012 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:20:55 crc kubenswrapper[4971]: I0309 09:20:55.791788 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.079112 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.401365 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.403819 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.404663 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.404702 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.404711 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:56 crc kubenswrapper[4971]: I0309 09:20:56.405282 4971 scope.go:117] "RemoveContainer" 
containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:20:56 crc kubenswrapper[4971]: E0309 09:20:56.405496 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:20:57 crc kubenswrapper[4971]: I0309 09:20:57.081328 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:57 crc kubenswrapper[4971]: E0309 09:20:57.226862 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:20:58 crc kubenswrapper[4971]: I0309 09:20:58.076119 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.062608 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.062780 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.064101 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.064170 4971 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.064194 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.065329 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:20:59 crc kubenswrapper[4971]: E0309 09:20:59.065702 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.079513 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:20:59 crc kubenswrapper[4971]: E0309 09:20:59.324310 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.339326 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.340466 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.340574 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.340588 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.340616 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:20:59 crc kubenswrapper[4971]: E0309 09:20:59.345265 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.660318 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.660480 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.661719 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.661742 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.661751 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:20:59 crc kubenswrapper[4971]: I0309 09:20:59.673478 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:21:00 crc kubenswrapper[4971]: I0309 09:21:00.081600 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope
Mar 09 09:21:00 crc kubenswrapper[4971]: W0309 09:21:00.170725 4971 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 09 09:21:00 crc kubenswrapper[4971]: E0309 09:21:00.170990 4971 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 09 09:21:00 crc kubenswrapper[4971]: I0309 09:21:00.413772 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:21:00 crc kubenswrapper[4971]: I0309 09:21:00.414744 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:00 crc kubenswrapper[4971]: I0309 09:21:00.414785 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:00 crc kubenswrapper[4971]: I0309 09:21:00.414811 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:01 crc kubenswrapper[4971]: I0309 09:21:01.078932 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:02 crc kubenswrapper[4971]: I0309 09:21:02.078577 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:03 crc kubenswrapper[4971]: I0309 09:21:03.077615 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:04 crc kubenswrapper[4971]: I0309 09:21:04.079012 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:05 crc kubenswrapper[4971]: I0309 09:21:05.077404 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.078980 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:06 crc kubenswrapper[4971]: E0309 09:21:06.330921 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.346038 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.347195 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.347271 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.347305 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:06 crc kubenswrapper[4971]: I0309 09:21:06.347400 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:21:06 crc kubenswrapper[4971]: E0309 09:21:06.351921 4971 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 09:21:07 crc kubenswrapper[4971]: I0309 09:21:07.078278 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:07 crc kubenswrapper[4971]: E0309 09:21:07.227066 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:21:07 crc kubenswrapper[4971]: I0309 09:21:07.752922 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 09:21:07 crc kubenswrapper[4971]: I0309 09:21:07.768449 4971 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 09 09:21:08 crc kubenswrapper[4971]: I0309 09:21:08.080976 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:09 crc kubenswrapper[4971]: I0309 09:21:09.078820 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.081173 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.151597 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.153294 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.153431 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.153454 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:10 crc kubenswrapper[4971]: I0309 09:21:10.154540 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff"
Mar 09 09:21:10 crc kubenswrapper[4971]: E0309 09:21:10.154878 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:21:11 crc kubenswrapper[4971]: I0309 09:21:11.078634 4971 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:21:12 crc kubenswrapper[4971]: I0309 09:21:12.038339 4971 csr.go:261] certificate signing request csr-vlm4t is approved, waiting to be issued
Mar 09 09:21:12 crc kubenswrapper[4971]: I0309 09:21:12.051197 4971 csr.go:257] certificate signing request csr-vlm4t is issued
Mar 09 09:21:12 crc kubenswrapper[4971]: I0309 09:21:12.088007 4971 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 09 09:21:12 crc kubenswrapper[4971]: I0309 09:21:12.921121 4971 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.053213 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 18:52:32.130595861 +0000 UTC
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.053278 4971 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7137h31m19.077322857s for next certificate rotation
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.352836 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.354631 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.354694 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.354719 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.354881 4971 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.365520 4971 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.365902 4971 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.365944 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.370672 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.370735 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.370753 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.370775 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.370792 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:13Z","lastTransitionTime":"2026-03-09T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.390816 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.401442 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.401552 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.401576 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.401607 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.401629 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:13Z","lastTransitionTime":"2026-03-09T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.414738 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.423134 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.423213 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.423237 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.423270 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.423297 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:13Z","lastTransitionTime":"2026-03-09T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.436622 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.447722 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.447758 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.447766 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.447780 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:13 crc kubenswrapper[4971]: I0309 09:21:13.447789 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:13Z","lastTransitionTime":"2026-03-09T09:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.458754 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.458870 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.458893 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.559177 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.660434 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.760596 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.860809 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:13 crc kubenswrapper[4971]: E0309 09:21:13.961507 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.061603 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.162307 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.262869 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.364032 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.464365 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.565491 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.666074 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.766244 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.866499 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:14 crc kubenswrapper[4971]: E0309 09:21:14.966611 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.067709 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.168813 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.269303 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.370154 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.470624 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.571269 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.672488 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.773539 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.874468 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:15 crc kubenswrapper[4971]: E0309 09:21:15.974832 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.075836 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.176046 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.276808 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.377185 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.477587 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.577912 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.679033 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.780047 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.881071 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:16 crc kubenswrapper[4971]: E0309 09:21:16.981703 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.082670 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.183691 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.228109 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.284553 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.384773 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.485752 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.586295 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.687339 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.787519 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.888683 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:17 crc kubenswrapper[4971]: E0309 09:21:17.989844 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.090016 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.190206 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.291021 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.391530 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.492411 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.592874 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.693681 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.793842 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.894575 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:18 crc kubenswrapper[4971]: E0309 09:21:18.995408 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.096408 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.196752 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.297088 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.398223 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.499101 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.599941 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.700622 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.801143 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:19 crc kubenswrapper[4971]: E0309 09:21:19.901473 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.002446 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.103320 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: I0309 09:21:20.151466 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:21:20 crc kubenswrapper[4971]: I0309 09:21:20.153508 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:20 crc kubenswrapper[4971]: I0309 09:21:20.153579 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:20 crc kubenswrapper[4971]: I0309 09:21:20.153604 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.204425 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.305015 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.406214 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.506902 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.607933 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.709036 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: I0309 09:21:20.741933 4971 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.809759 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:20 crc kubenswrapper[4971]: E0309 09:21:20.910596 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.011083 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.111646 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.212158 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.313072 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.414176 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.514915 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.615991 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.716558 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.817086 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:21 crc kubenswrapper[4971]: E0309 09:21:21.917733 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.018152 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.119160 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.220152 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.320946 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.422016 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.522872 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.623621 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.723780 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.824967 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:22 crc kubenswrapper[4971]: E0309 09:21:22.926040 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.026371 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.126712 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.227555 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.328763 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.429788 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.530446 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.630773 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.732157 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.831234 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.835408 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.835446 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.835455 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.835470 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.835480 4971 setters.go:603] "Node became not ready"
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:23Z","lastTransitionTime":"2026-03-09T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.846403 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.850007 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.850062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.850078 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.850099 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.850111 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:23Z","lastTransitionTime":"2026-03-09T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.862166 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.866241 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.866310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.866319 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.866334 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.866343 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:23Z","lastTransitionTime":"2026-03-09T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.876846 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.880369 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.880424 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.880438 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.880457 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:23 crc kubenswrapper[4971]: I0309 09:21:23.880474 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:23Z","lastTransitionTime":"2026-03-09T09:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.893538 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.893699 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.893722 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:23 crc kubenswrapper[4971]: E0309 09:21:23.993847 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.094852 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.195818 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.295953 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.396909 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.497894 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.598548 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.698683 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.799851 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:24 crc kubenswrapper[4971]: E0309 09:21:24.901072 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.001769 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.102927 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.126514 4971 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.151109 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.152715 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.152957 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.153112 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:25 crc kubenswrapper[4971]: I0309 09:21:25.154332 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.154836 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.203983 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.304418 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.404861 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.505791 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.606970 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.707982 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.808191 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:25 crc kubenswrapper[4971]: E0309 09:21:25.909129 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.010117 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.110943 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: I0309 09:21:26.152059 4971 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 09 09:21:26 crc kubenswrapper[4971]: I0309 09:21:26.153733 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:26 crc kubenswrapper[4971]: I0309 09:21:26.153793 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:26 crc kubenswrapper[4971]: I0309 09:21:26.153832 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.211089 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.311807 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.412834 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.513660 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.614777 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.715865 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.816819 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:26 crc kubenswrapper[4971]: E0309 09:21:26.917045 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.017737 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.118710 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.219404 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.229037 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.320687 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.421455 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.522322 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.623585 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.725220 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.826704 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:27 crc kubenswrapper[4971]: E0309 09:21:27.927082 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.027595 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.128167 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.228491 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.328827 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.429630 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.530208 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.632981 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.733839 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.834858 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:28 crc kubenswrapper[4971]: E0309 09:21:28.935910 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.037048 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.138236 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.239188 4971 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.339832 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.440939 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.541714 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.642092 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.742961 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.843828 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:29 crc kubenswrapper[4971]: E0309 09:21:29.944936 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.046075 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.147286 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.248653 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.349800 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.450612 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.551267 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.652357 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.753321 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: I0309 09:21:30.836244 4971 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.853479 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:30 crc kubenswrapper[4971]: E0309 09:21:30.954373 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.055575 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.156645 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.257270 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.357714 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.458877 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc 
kubenswrapper[4971]: E0309 09:21:31.559991 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.660323 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.761315 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.862488 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:31 crc kubenswrapper[4971]: E0309 09:21:31.963430 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.064427 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.165668 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.266589 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.367563 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.468444 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.568619 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.669393 4971 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.770489 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.871586 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:32 crc kubenswrapper[4971]: E0309 09:21:32.972344 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.073427 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.174545 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.275601 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.376729 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.477941 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.578894 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.679962 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.780641 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.880727 4971 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:33 crc kubenswrapper[4971]: E0309 09:21:33.981734 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.081865 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.182694 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.220447 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.225206 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.225272 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.225291 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.225314 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.225332 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:34Z","lastTransitionTime":"2026-03-09T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.235784 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.240111 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.240156 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.240167 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.240182 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.240297 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:34Z","lastTransitionTime":"2026-03-09T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.254339 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.258505 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.258574 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.258593 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.258619 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.258638 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:34Z","lastTransitionTime":"2026-03-09T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.271104 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.275491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.275546 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.275563 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.275587 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:34 crc kubenswrapper[4971]: I0309 09:21:34.275602 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:34Z","lastTransitionTime":"2026-03-09T09:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.287062 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.287279 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.287316 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.387791 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.610122 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.711109 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.812705 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:34 crc kubenswrapper[4971]: E0309 09:21:34.913129 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.014451 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.115656 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.216119 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.317003 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.417138 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.517783 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.617964 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.718833 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.819108 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:35 crc kubenswrapper[4971]: E0309 09:21:35.919448 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.019842 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.120937 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.222030 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.322513 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.422708 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc 
kubenswrapper[4971]: E0309 09:21:36.523944 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.624704 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.724955 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.825831 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:36 crc kubenswrapper[4971]: E0309 09:21:36.927080 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.027406 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.127856 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.228122 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.229255 4971 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.328520 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.429621 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.529760 4971 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.630918 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.731833 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.832153 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:37 crc kubenswrapper[4971]: E0309 09:21:37.932588 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.033502 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.134515 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.151422 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.153291 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.153333 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.153377 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.154080 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.234973 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.335875 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.436790 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.537562 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.624804 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.627574 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"} Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.627708 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.629633 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.629677 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:38 crc kubenswrapper[4971]: I0309 09:21:38.629708 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.638059 4971 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.738455 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.838601 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:38 crc kubenswrapper[4971]: E0309 09:21:38.939797 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.039928 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.062425 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.140180 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.240799 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.341585 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.441863 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.543079 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.633236 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 09:21:39 
crc kubenswrapper[4971]: I0309 09:21:39.634262 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.636467 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" exitCode=255 Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.636507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"} Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.636561 4971 scope.go:117] "RemoveContainer" containerID="be92731244d565c48c1f8b9a81c38478a3e1e7f546fd9a47273fab0ab8a786ff" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.636706 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.638807 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.638880 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.638904 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:39 crc kubenswrapper[4971]: I0309 09:21:39.640298 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.640786 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.643236 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.743416 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.844450 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:39 crc kubenswrapper[4971]: E0309 09:21:39.944642 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.045800 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.146905 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.248018 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.348808 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.448905 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.552037 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.642520 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.645978 4971 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.647324 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.647412 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.647430 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:40 crc kubenswrapper[4971]: I0309 09:21:40.648430 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.648722 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.653258 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.753838 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.854132 4971 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:40 crc kubenswrapper[4971]: E0309 09:21:40.954653 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.055094 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.155628 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.256168 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.356939 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.457386 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.558057 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.658507 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.759671 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.860868 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:41 crc kubenswrapper[4971]: E0309 09:21:41.962177 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc 
kubenswrapper[4971]: E0309 09:21:42.062789 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc kubenswrapper[4971]: E0309 09:21:42.163619 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc kubenswrapper[4971]: E0309 09:21:42.264244 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc kubenswrapper[4971]: E0309 09:21:42.365657 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc kubenswrapper[4971]: E0309 09:21:42.466725 4971 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.509002 4971 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.570286 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.570657 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.570817 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.570976 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.571105 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:42Z","lastTransitionTime":"2026-03-09T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.674062 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.674515 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.674701 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.674923 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.675162 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:42Z","lastTransitionTime":"2026-03-09T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.778762 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.778920 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.778944 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.778975 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.778998 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:42Z","lastTransitionTime":"2026-03-09T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.882024 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.882087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.882104 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.882128 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.882146 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:42Z","lastTransitionTime":"2026-03-09T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.984747 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.984809 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.984829 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.984854 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:42 crc kubenswrapper[4971]: I0309 09:21:42.984870 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:42Z","lastTransitionTime":"2026-03-09T09:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.087659 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.087733 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.087756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.087792 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.087813 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.121441 4971 apiserver.go:52] "Watching apiserver" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.127461 4971 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.127767 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.128227 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.128625 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.128732 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.129268 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.129401 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.129427 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.129468 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.129541 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.129630 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.132500 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133056 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133389 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133433 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133635 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133718 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.133918 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.134035 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.139527 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.174178 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.186966 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.187817 4971 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.190616 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.190662 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.190674 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.190694 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.190708 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.208492 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.225152 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229343 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229458 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229510 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229563 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229642 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229681 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229754 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229790 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229875 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " 
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229911 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229882 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.229956 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230029 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230252 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230332 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230470 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230494 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230539 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230592 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230651 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230688 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230721 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230819 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230861 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230911 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230943 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230952 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.230973 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231003 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231015 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231034 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231066 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231099 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231130 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231162 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231194 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231225 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231256 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231288 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231321 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231377 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231376 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231417 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231474 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231514 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231548 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231580 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231611 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231671 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231771 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231801 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.231832 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231864 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231893 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231921 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231980 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232009 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232041 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232069 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.232162 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232192 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232224 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232259 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232320 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232470 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232499 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232560 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232593 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232629 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232659 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232687 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232749 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232780 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232811 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232842 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.232870 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232931 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232991 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233024 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233056 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233087 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233151 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233183 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233214 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233243 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233272 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233338 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233390 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233426 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233463 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233495 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233531 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233562 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233595 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233625 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233689 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233750 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233783 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233820 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233857 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233892 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233956 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 
09:21:43.233988 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234021 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234087 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234114 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234144 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234198 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234230 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234295 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 
09:21:43.234325 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234472 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234502 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234531 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234596 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234630 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234663 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234694 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234727 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234759 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234791 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234830 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234862 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234891 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234921 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234952 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235016 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235046 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235076 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235105 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235164 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235196 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.235224 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235257 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235327 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235386 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235523 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235556 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235588 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235617 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 
09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235680 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235737 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235771 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235808 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235839 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235870 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235907 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235943 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235979 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236012 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236044 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 
09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236077 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236111 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236208 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236237 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") 
pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236268 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236303 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236336 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236391 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236425 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236463 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236498 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.231540 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232103 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236628 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232181 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232231 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232632 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.232743 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233285 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233408 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233302 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233458 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233707 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.233791 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234227 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234328 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234725 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234750 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234840 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234920 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.234930 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235217 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235030 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235333 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235759 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235792 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235860 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235806 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236898 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236948 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.235953 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236018 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236373 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236426 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236524 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.237538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.237653 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.237906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238022 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238049 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238085 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238558 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238657 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.236535 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238731 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238903 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.238989 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239026 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239052 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239149 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239103 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239253 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239301 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239339 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239408 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239422 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239472 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239492 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239638 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239656 4971 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239686 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239743 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239761 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239775 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239789 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239803 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239817 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239830 4971 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239867 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239881 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239892 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239905 4971 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239917 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239928 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.239974 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240027 4971 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240058 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240095 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240131 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240163 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240194 4971 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240226 4971 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240259 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240545 4971 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240573 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240595 4971 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240620 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240643 4971 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240668 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240691 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240713 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240738 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240762 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.239891 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.241889 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:43.741864075 +0000 UTC m=+107.301791895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239554 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239903 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.239952 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240049 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240663 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.240957 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.241091 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.241140 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.241509 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.242093 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.242218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.242447 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.242730 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.242789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.242831 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:43.742802653 +0000 UTC m=+107.302730513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.242846 4971 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.243316 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.243384 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.245463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.246321 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.246847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.246856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.247395 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250076 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250134 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250149 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250487 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250476 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250531 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.250940 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.251075 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.251267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.251702 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.251964 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252035 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252705 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.252821 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.256069 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.256402 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.256630 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:21:43.756607536 +0000 UTC m=+107.316535356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.256920 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.257156 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.257380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.257968 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.257998 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.258424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.258618 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.258840 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.258951 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.259000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.259000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.259207 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.259385 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.256925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.260214 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.260724 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.260992 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261578 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261317 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.261970 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262084 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262258 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262520 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262739 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.262750 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263031 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.263317 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263323 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.263358 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.263417 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.263548 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:43.763521027 +0000 UTC m=+107.323449177 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263635 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263341 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263809 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.263979 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.264008 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.264244 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.264294 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.264862 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.264965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.266901 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.267197 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.267242 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.267260 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.267338 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:43.767315258 +0000 UTC m=+107.327243078 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.269059 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.269204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.269264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.269486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.271865 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.272706 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.272822 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.272965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.273265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.273391 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.274332 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.275181 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.275528 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.275685 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.278728 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.278958 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.278850 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.279696 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.279847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.280134 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.280165 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.280414 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.280535 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.282043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.282391 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.283906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284179 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284228 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284343 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.284858 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.285496 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.285594 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.285680 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.286139 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.286527 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.286977 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.286981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287048 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287200 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287754 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287838 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287852 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.287924 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.288007 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.288024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.288020 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.288116 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.288159 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.289119 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.289228 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.289264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.289500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.290694 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.290955 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.291010 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.294153 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.294183 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.294195 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.294213 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.294225 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.295481 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.300367 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.307200 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.309885 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.312864 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342419 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342473 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342563 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342581 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342594 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342606 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342623 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342640 4971 reconciler_common.go:293] 
"Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342656 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342690 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342706 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342723 4971 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342742 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342757 4971 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342773 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342788 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342802 4971 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342817 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342831 4971 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342846 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342861 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342877 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342893 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342937 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342954 4971 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342967 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342984 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.342997 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343012 4971 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343025 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343040 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343057 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343073 4971 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343089 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343105 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343121 4971 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343136 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343151 4971 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343165 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343179 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343192 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343206 4971 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343220 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343233 4971 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343246 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343260 4971 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343275 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343290 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343697 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343729 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343743 4971 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343804 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343832 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343845 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343857 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343869 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343889 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343901 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343958 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.343982 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344000 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344027 4971 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344047 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344060 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344073 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344096 
4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344112 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344126 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344140 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344157 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344169 4971 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344185 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344202 
4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344215 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344229 4971 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344241 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344259 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344272 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344284 4971 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344298 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344314 4971 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344327 4971 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344339 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344376 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344389 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344401 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344415 4971 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc 
kubenswrapper[4971]: I0309 09:21:43.344430 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344441 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344453 4971 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344464 4971 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344484 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344497 4971 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344510 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344525 4971 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344543 4971 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344554 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344566 4971 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344583 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344595 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344607 4971 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344619 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344635 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344649 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344661 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344688 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344701 4971 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344713 4971 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344731 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344745 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344756 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344767 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344783 4971 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344794 4971 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344806 4971 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344817 4971 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344833 4971 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344845 4971 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344856 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344869 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344885 4971 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344896 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344909 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344925 4971 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344937 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344949 4971 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344961 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344977 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344988 4971 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.344999 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345010 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345026 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345037 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345051 4971 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345063 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345079 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345090 4971 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345102 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345118 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345129 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345141 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345152 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345168 4971 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345180 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: 
I0309 09:21:43.345193 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345205 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345222 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345233 4971 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345244 4971 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345260 4971 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345272 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345283 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345295 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345311 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345328 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345340 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345372 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.345388 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.396635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 
09:21:43.396702 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.396715 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.396776 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.396790 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.453852 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.469739 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.477030 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:21:43 crc kubenswrapper[4971]: W0309 09:21:43.493975 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-2d5f60e35579a588be7703a2031e409c910e0e8eb593aaf6b30b5f0b47b2daae WatchSource:0}: Error finding container 2d5f60e35579a588be7703a2031e409c910e0e8eb593aaf6b30b5f0b47b2daae: Status 404 returned error can't find the container with id 2d5f60e35579a588be7703a2031e409c910e0e8eb593aaf6b30b5f0b47b2daae Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.500520 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.500558 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.500570 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.500588 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.500600 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: W0309 09:21:43.501991 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-d9522410f92b53b10d08d9d4f228034f3b51e815b85bfa79e702ff2cebecaf37 WatchSource:0}: Error finding container d9522410f92b53b10d08d9d4f228034f3b51e815b85bfa79e702ff2cebecaf37: Status 404 returned error can't find the container with id d9522410f92b53b10d08d9d4f228034f3b51e815b85bfa79e702ff2cebecaf37 Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.603215 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.603249 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.603261 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.603277 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.603289 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.653808 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d9522410f92b53b10d08d9d4f228034f3b51e815b85bfa79e702ff2cebecaf37"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.658193 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2d5f60e35579a588be7703a2031e409c910e0e8eb593aaf6b30b5f0b47b2daae"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.659284 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"303ec436e1384aea16ea319fd0db73e4fcf0e4d074bbb40892e40ebc61c110d8"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.706069 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.706455 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.706465 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.706485 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.706497 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.748382 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.748444 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.748524 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.748575 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:44.748559156 +0000 UTC m=+108.308486966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.748902 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.748980 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:44.748966928 +0000 UTC m=+108.308894738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.808842 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.808911 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.808926 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.808941 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.808951 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.848867 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.848978 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.849054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849197 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 
09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849259 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849279 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849372 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:44.849331358 +0000 UTC m=+108.409259188 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849226 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849410 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849431 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849501 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:44.849476902 +0000 UTC m=+108.409404752 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:43 crc kubenswrapper[4971]: E0309 09:21:43.849733 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:21:44.849700878 +0000 UTC m=+108.409628688 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.911767 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.912131 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.912234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.912336 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:43 crc kubenswrapper[4971]: I0309 09:21:43.912446 4971 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:43Z","lastTransitionTime":"2026-03-09T09:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.016087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.016454 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.016592 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.016752 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.016876 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.082640 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ghvzg"] Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.083228 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ghvzg" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.085944 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.086203 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.086558 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.095743 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.107178 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.114515 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.118943 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.118977 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.118986 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.118999 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.119010 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.121979 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.133844 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.143654 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.151418 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5315be33-28da-40cd-a2df-e86f5f473b98-hosts-file\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.151446 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mznw\" (UniqueName: \"kubernetes.io/projected/5315be33-28da-40cd-a2df-e86f5f473b98-kube-api-access-9mznw\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.153657 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.222396 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.222452 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.222471 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.222509 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.222524 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.252004 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5315be33-28da-40cd-a2df-e86f5f473b98-hosts-file\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.252099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mznw\" (UniqueName: \"kubernetes.io/projected/5315be33-28da-40cd-a2df-e86f5f473b98-kube-api-access-9mznw\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.252164 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5315be33-28da-40cd-a2df-e86f5f473b98-hosts-file\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.268587 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mznw\" (UniqueName: \"kubernetes.io/projected/5315be33-28da-40cd-a2df-e86f5f473b98-kube-api-access-9mznw\") pod \"node-resolver-ghvzg\" (UID: \"5315be33-28da-40cd-a2df-e86f5f473b98\") " pod="openshift-dns/node-resolver-ghvzg"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.325559 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.325631 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.325653 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.325677 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.325695 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.397278 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ghvzg"
Mar 09 09:21:44 crc kubenswrapper[4971]: W0309 09:21:44.414894 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5315be33_28da_40cd_a2df_e86f5f473b98.slice/crio-366cd4df41a350165aea2093bae9534faa4dbeb3e162921ea7d06025f0ab3a74 WatchSource:0}: Error finding container 366cd4df41a350165aea2093bae9534faa4dbeb3e162921ea7d06025f0ab3a74: Status 404 returned error can't find the container with id 366cd4df41a350165aea2093bae9534faa4dbeb3e162921ea7d06025f0ab3a74
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.428240 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.428328 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.428400 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.428423 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.428440 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.465996 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-p56wx"]
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.466538 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.466841 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wbp4g"]
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.467373 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-572n5"]
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.467542 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-572n5"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.467870 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wbp4g"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.468982 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.472457 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.472661 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.472817 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473079 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473228 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473422 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473635 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473713 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473720 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.473901 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.474019 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.488529 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.490109 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.490177 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.490204 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.490234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.490257 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.510164 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z"
Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.510280 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.515283 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.515336 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.515377 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.515400 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.515417 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.523598 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z"
Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.527557 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.531166 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.531231 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.531255 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.531284 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.531308 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.535091 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fde3ad-1182-4b15-bb1a-f365ecc92d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p56wx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.547212 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.548461 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554668 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-hostroot\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-conf-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554675 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554764 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05fde3ad-1182-4b15-bb1a-f365ecc92d75-proxy-tls\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554801 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554824 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-k8s-cni-cncf-io\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554864 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-netns\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554885 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-kubelet\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554846 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/05fde3ad-1182-4b15-bb1a-f365ecc92d75-rootfs\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554930 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89k4t\" (UniqueName: \"kubernetes.io/projected/05fde3ad-1182-4b15-bb1a-f365ecc92d75-kube-api-access-89k4t\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554968 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.554985 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhbcx\" (UniqueName: \"kubernetes.io/projected/156929ae-cd9c-46c6-8bf1-bc28162f6917-kube-api-access-mhbcx\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " 
pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555010 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-cnibin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555023 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-etc-kubernetes\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555038 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-multus\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555061 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-cni-binary-copy\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-bin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc 
kubenswrapper[4971]: I0309 09:21:44.555091 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-system-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555106 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555128 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-system-cni-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555142 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-os-release\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555156 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " 
pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555172 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-multus-certs\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555230 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05fde3ad-1182-4b15-bb1a-f365ecc92d75-mcd-auth-proxy-config\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555336 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-cnibin\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555423 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq6kx\" (UniqueName: \"kubernetes.io/projected/c18b589a-f45c-4d0d-9779-3e39d74e057a-kube-api-access-tq6kx\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-socket-dir-parent\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555536 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-os-release\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.555605 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-daemon-config\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.558132 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.567552 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.573102 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.576991 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.577283 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.577346 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.577394 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.577416 4971 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.577434 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.590067 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.590427 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"be9838f3-2d37-4bad-8f06-25d06e28ed61\\\",\\\"systemUUID\\\":\\\"12d699f3-b441-4abe-bc2e-d70473202cd1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.590548 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.592797 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.592833 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.592845 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.592862 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.592873 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.602320 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.616281 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-572n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"156929ae-cd9c-46c6-8bf1-bc28162f6917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-572n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.635635 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18b589a-f45c-4d0d-9779-3e39d74e057a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wbp4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.650466 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656757 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-cni-binary-copy\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 
crc kubenswrapper[4971]: I0309 09:21:44.656783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-bin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656810 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-system-cni-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656827 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-os-release\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656843 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656860 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-system-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 
09:21:44.656876 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656898 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-multus-certs\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656893 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-bin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656913 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-cnibin\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656976 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-cnibin\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.656989 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tq6kx\" (UniqueName: \"kubernetes.io/projected/c18b589a-f45c-4d0d-9779-3e39d74e057a-kube-api-access-tq6kx\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657013 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05fde3ad-1182-4b15-bb1a-f365ecc92d75-mcd-auth-proxy-config\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657028 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-os-release\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-socket-dir-parent\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657143 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-daemon-config\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657180 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-os-release\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657212 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-socket-dir-parent\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-hostroot\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657277 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-conf-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657320 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-k8s-cni-cncf-io\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657392 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-netns\") pod \"multus-572n5\" (UID: 
\"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657437 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-kubelet\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657482 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05fde3ad-1182-4b15-bb1a-f365ecc92d75-proxy-tls\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657494 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-system-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657519 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/05fde3ad-1182-4b15-bb1a-f365ecc92d75-rootfs\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657544 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-conf-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc 
kubenswrapper[4971]: I0309 09:21:44.657578 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657550 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657628 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89k4t\" (UniqueName: \"kubernetes.io/projected/05fde3ad-1182-4b15-bb1a-f365ecc92d75-kube-api-access-89k4t\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657643 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657660 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhbcx\" (UniqueName: \"kubernetes.io/projected/156929ae-cd9c-46c6-8bf1-bc28162f6917-kube-api-access-mhbcx\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " 
pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657674 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-etc-kubernetes\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657693 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-cnibin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657707 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-multus\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657742 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-cni-multus\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657765 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-multus-certs\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657870 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-cni-binary-copy\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657960 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-etc-kubernetes\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657912 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-cni-dir\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.657976 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05fde3ad-1182-4b15-bb1a-f365ecc92d75-mcd-auth-proxy-config\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658015 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-var-lib-kubelet\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-k8s-cni-cncf-io\") 
pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658083 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-os-release\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658091 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-host-run-netns\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658186 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/156929ae-cd9c-46c6-8bf1-bc28162f6917-multus-daemon-config\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658198 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-cnibin\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658228 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/156929ae-cd9c-46c6-8bf1-bc28162f6917-hostroot\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658245 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/05fde3ad-1182-4b15-bb1a-f365ecc92d75-rootfs\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658499 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658618 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c18b589a-f45c-4d0d-9779-3e39d74e057a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.658958 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c18b589a-f45c-4d0d-9779-3e39d74e057a-system-cni-dir\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.668308 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05fde3ad-1182-4b15-bb1a-f365ecc92d75-proxy-tls\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.669450 4971 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.670487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"50a72c6521f9cbb086f70a33c18c40b7e2c49b482669a081fe059af9543f61cb"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.672072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ghvzg" event={"ID":"5315be33-28da-40cd-a2df-e86f5f473b98","Type":"ContainerStarted","Data":"366cd4df41a350165aea2093bae9534faa4dbeb3e162921ea7d06025f0ab3a74"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.673810 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aaa9242e76f1a49cf4822bbc4b805d7e3fa4507499d79de57783f01bdc8a69fa"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.673841 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f3d13c98abbbff627986e99cba64d1851c259aec69c26b798335b680b6adc908"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.679305 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhbcx\" (UniqueName: \"kubernetes.io/projected/156929ae-cd9c-46c6-8bf1-bc28162f6917-kube-api-access-mhbcx\") pod \"multus-572n5\" (UID: \"156929ae-cd9c-46c6-8bf1-bc28162f6917\") " pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.680434 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq6kx\" (UniqueName: \"kubernetes.io/projected/c18b589a-f45c-4d0d-9779-3e39d74e057a-kube-api-access-tq6kx\") pod \"multus-additional-cni-plugins-wbp4g\" (UID: \"c18b589a-f45c-4d0d-9779-3e39d74e057a\") " pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.681038 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89k4t\" (UniqueName: \"kubernetes.io/projected/05fde3ad-1182-4b15-bb1a-f365ecc92d75-kube-api-access-89k4t\") pod \"machine-config-daemon-p56wx\" (UID: \"05fde3ad-1182-4b15-bb1a-f365ecc92d75\") " pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.686443 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.695337 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.695383 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.695396 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.695415 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.695428 4971 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.698930 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fde3ad-1182-4b15-bb1a-f365ecc92d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p56wx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.715476 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.723866 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.736138 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.747953 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.759009 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.759088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.759644 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.759802 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:46.759769084 +0000 UTC m=+110.319696904 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.759883 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.760006 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:46.7599654 +0000 UTC m=+110.319893430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.760503 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaa9242e76f1a49cf4822bbc4b805d7e3fa4507499d79de57783f01bdc8a69fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3d13c98abbbff627986e99cba64d1851c259aec69c26b798335b680b6adc908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.777793 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.793759 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.793843 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-572n5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"156929ae-cd9c-46c6-8bf1-bc28162f6917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-572n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.797050 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.797078 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.797087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.797102 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.797111 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.807984 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-572n5" Mar 09 09:21:44 crc kubenswrapper[4971]: W0309 09:21:44.808589 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05fde3ad_1182_4b15_bb1a_f365ecc92d75.slice/crio-235a12d1bfcab734f7f1260d6f05a802ac2cd954b9db25c0e96c12af27168fff WatchSource:0}: Error finding container 235a12d1bfcab734f7f1260d6f05a802ac2cd954b9db25c0e96c12af27168fff: Status 404 returned error can't find the container with id 235a12d1bfcab734f7f1260d6f05a802ac2cd954b9db25c0e96c12af27168fff Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.809807 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18b589a-f45c-4d0d-9779-3e39d74e057a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wbp4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.815600 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.825213 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a72c6521f9cbb086f70a33c18c40b7e2c49b482669a081fe059af9543f61cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.842854 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.845871 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9bhsp"] Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.846893 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.849177 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.850754 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.850879 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.850923 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.851590 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.851604 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.851766 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.859117 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.859408 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.859548 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 
09:21:44.859592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859717 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859739 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859753 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859753 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859776 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859788 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859879 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:21:46.859860756 +0000 UTC m=+110.419788566 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859921 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:46.859913798 +0000 UTC m=+110.419841618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:44 crc kubenswrapper[4971]: E0309 09:21:44.859957 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:21:46.859950579 +0000 UTC m=+110.419878389 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.883820 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fde3ad-1182-4b15-bb1a-f365ecc92d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p56wx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.902326 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.902401 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.902415 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.902434 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.902449 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:44Z","lastTransitionTime":"2026-03-09T09:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.904452 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.920631 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.937626 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-572n5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"156929ae-cd9c-46c6-8bf1-bc28162f6917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhbcx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-572n5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.953823 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18b589a-f45c-4d0d-9779-3e39d74e057a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wbp4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960477 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960540 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960564 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960594 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960668 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960732 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960751 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960772 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960791 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960811 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5czk\" (UniqueName: \"kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960921 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.960981 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.961016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.961063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.961122 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 
09:21:44.961153 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.961197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.961219 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.973925 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9bhsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:44 crc kubenswrapper[4971]: I0309 09:21:44.987762 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50a72c6521f9cbb086f70a33c18c40b7e2c49b482669a081fe059af9543f61cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.004872 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.004919 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.004931 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.004954 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.004969 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.014523 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaa9242e76f1a49cf4822bbc4b805d7e3fa4507499d79de57783f01bdc8a69fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3d13c98abbbff627986e99cba64d1851c259aec69c26b798335b680b6adc908\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.029063 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.045684 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061110 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061652 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061740 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061788 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061802 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061822 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061872 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061903 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.061983 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062033 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062068 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062090 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062094 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062137 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062147 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062162 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5czk\" (UniqueName: \"kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062193 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062219 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062247 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062304 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062429 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062457 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062484 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log\") pod 
\"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.063693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.067932 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.068244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.068756 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.069116 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.062190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.069197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.069236 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.069267 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.069644 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc 
kubenswrapper[4971]: I0309 09:21:45.069690 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.083723 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fde3ad-1182-4b15-bb1a-f365ecc92d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p56wx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.093100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5czk\" (UniqueName: \"kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk\") pod \"ovnkube-node-9bhsp\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.111631 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.111662 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.111670 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.111683 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.111691 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.151742 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.151779 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:45 crc kubenswrapper[4971]: E0309 09:21:45.151861 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.151787 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:45 crc kubenswrapper[4971]: E0309 09:21:45.151936 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:45 crc kubenswrapper[4971]: E0309 09:21:45.152052 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.155703 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.156439 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.157620 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.158384 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.164076 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.164908 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.165728 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.166399 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.167120 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.168978 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.169625 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.170918 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.171555 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.173343 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.173974 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.174638 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.176987 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.177466 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.178144 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.179272 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.179862 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.180400 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.181706 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.182269 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.183635 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.184087 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.185312 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.186103 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.188520 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.189612 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.190181 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.194441 4971 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.194574 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.197694 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.199132 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.199785 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.201909 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.202754 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.203847 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.204663 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.206030 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.206681 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.207451 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.208670 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.209816 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.210926 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.212474 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.213954 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.213985 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.213996 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.214012 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.214031 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.214777 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.216325 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.217449 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.218564 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.219301 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.220637 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.221626 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.222648 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.316294 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.316326 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.316334 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.316366 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.316379 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.418853 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.418892 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.418900 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.418914 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.418923 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.522076 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.522119 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.522130 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.522149 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.522160 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.624993 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.625036 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.625045 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.625060 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.625071 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.678839 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-572n5" event={"ID":"156929ae-cd9c-46c6-8bf1-bc28162f6917","Type":"ContainerStarted","Data":"308e2c9f1c0aef07bddf1fe2dd614efd8e69ecbd27c2f8e0029dc3838c626674"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.678900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-572n5" event={"ID":"156929ae-cd9c-46c6-8bf1-bc28162f6917","Type":"ContainerStarted","Data":"4c8ddc3db0d5de8567101d4b196f31ff6dd59b33faa4413661157ef0a2d436fb"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.680380 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ghvzg" event={"ID":"5315be33-28da-40cd-a2df-e86f5f473b98","Type":"ContainerStarted","Data":"3d534778b617adf636285b11c052cd88d2c3e5998b763dfacde14d1bcf4dae13"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.682210 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51" exitCode=0 Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.682300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.682338 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"2d159f020b7b482e3bb2301ad0e2eab87a3bbf0d63a10a6fa9a63438b96ee8e3"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.684183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"08a07ddd4acaafe7163713fb05d7a79ef1407dad29dd1df62a48946812da7654"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.684243 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.684257 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"235a12d1bfcab734f7f1260d6f05a802ac2cd954b9db25c0e96c12af27168fff"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.685926 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="b6a13cc5efc9bab3bd7850cdf41fd5a9ed22934d1a041f680a1e30f12775faa2" exitCode=0 Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.686132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"b6a13cc5efc9bab3bd7850cdf41fd5a9ed22934d1a041f680a1e30f12775faa2"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.686167 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerStarted","Data":"e7b2a58a3f6a7f83346d0dcb37b7869d1d87a53bb780132ad50d26d684fe5cd7"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.695897 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05fde3ad-1182-4b15-bb1a-f365ecc92d75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89k4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p56wx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.710301 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.722293 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ghvzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5315be33-28da-40cd-a2df-e86f5f473b98\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mznw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ghvzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.727541 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.727598 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.727610 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.727625 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.727634 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.742669 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c18b589a-f45c-4d0d-9779-3e39d74e057a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq6kx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wbp4g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.770019 4971 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:21:44Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j5czk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:21:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9bhsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.791293 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.802136 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.803343 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:21:45 crc kubenswrapper[4971]: E0309 09:21:45.803670 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.835530 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.835563 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.835573 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.835587 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 
09:21:45.835597 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.846367 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-572n5" podStartSLOduration=45.846335872 podStartE2EDuration="45.846335872s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:45.846125926 +0000 UTC m=+109.406053736" watchObservedRunningTime="2026-03-09 09:21:45.846335872 +0000 UTC m=+109.406263682" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.919546 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ghvzg" podStartSLOduration=46.919524578 podStartE2EDuration="46.919524578s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:45.902515462 +0000 UTC m=+109.462443282" watchObservedRunningTime="2026-03-09 09:21:45.919524578 +0000 UTC m=+109.479452388" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.937710 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.937756 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.937765 4971 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.937781 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.937793 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:45Z","lastTransitionTime":"2026-03-09T09:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:45 crc kubenswrapper[4971]: I0309 09:21:45.977407 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podStartSLOduration=46.977388457 podStartE2EDuration="46.977388457s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:45.977237013 +0000 UTC m=+109.537164823" watchObservedRunningTime="2026-03-09 09:21:45.977388457 +0000 UTC m=+109.537316267" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.040302 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.040338 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.040365 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.040382 4971 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.040394 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.118638 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x2g9k"] Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.119402 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.121979 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.122403 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.122697 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.122720 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.143216 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.143256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:46 
crc kubenswrapper[4971]: I0309 09:21:46.143266 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.143280 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.143288 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.245522 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.245564 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.245578 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.245596 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.245610 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.273177 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d930370-de32-4898-b09e-bc82901ab59c-serviceca\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.273220 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkl9\" (UniqueName: \"kubernetes.io/projected/8d930370-de32-4898-b09e-bc82901ab59c-kube-api-access-tvkl9\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.273463 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d930370-de32-4898-b09e-bc82901ab59c-host\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.274901 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"] Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.275268 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.278410 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.279301 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.295713 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9lhtb"] Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.296114 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.296169 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.347728 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.347770 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.347781 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.347801 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.347814 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blcd\" (UniqueName: \"kubernetes.io/projected/8b19b44a-0898-4886-b5d2-4bc4ff950094-kube-api-access-9blcd\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375213 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375258 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgfk\" (UniqueName: \"kubernetes.io/projected/5062a2a0-2135-43ec-914d-2e50f8541f22-kube-api-access-7mgfk\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375556 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8d930370-de32-4898-b09e-bc82901ab59c-host\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375592 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375617 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d930370-de32-4898-b09e-bc82901ab59c-serviceca\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5062a2a0-2135-43ec-914d-2e50f8541f22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375656 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d930370-de32-4898-b09e-bc82901ab59c-host\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.375676 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkl9\" (UniqueName: 
\"kubernetes.io/projected/8d930370-de32-4898-b09e-bc82901ab59c-kube-api-access-tvkl9\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.376987 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d930370-de32-4898-b09e-bc82901ab59c-serviceca\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.397734 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkl9\" (UniqueName: \"kubernetes.io/projected/8d930370-de32-4898-b09e-bc82901ab59c-kube-api-access-tvkl9\") pod \"node-ca-x2g9k\" (UID: \"8d930370-de32-4898-b09e-bc82901ab59c\") " pod="openshift-image-registry/node-ca-x2g9k" Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.431742 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x2g9k"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.451178 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.451226 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.451237 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.451252 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.451262 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477001 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgfk\" (UniqueName: \"kubernetes.io/projected/5062a2a0-2135-43ec-914d-2e50f8541f22-kube-api-access-7mgfk\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477135 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477212 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5062a2a0-2135-43ec-914d-2e50f8541f22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.477261 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blcd\" (UniqueName: \"kubernetes.io/projected/8b19b44a-0898-4886-b5d2-4bc4ff950094-kube-api-access-9blcd\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb"
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.477766 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.477883 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs podName:8b19b44a-0898-4886-b5d2-4bc4ff950094 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:46.977850287 +0000 UTC m=+110.537778137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs") pod "network-metrics-daemon-9lhtb" (UID: "8b19b44a-0898-4886-b5d2-4bc4ff950094") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.478405 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-env-overrides\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.480935 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5062a2a0-2135-43ec-914d-2e50f8541f22-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.484472 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5062a2a0-2135-43ec-914d-2e50f8541f22-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.495446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blcd\" (UniqueName: \"kubernetes.io/projected/8b19b44a-0898-4886-b5d2-4bc4ff950094-kube-api-access-9blcd\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.499800 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgfk\" (UniqueName: \"kubernetes.io/projected/5062a2a0-2135-43ec-914d-2e50f8541f22-kube-api-access-7mgfk\") pod \"ovnkube-control-plane-749d76644c-clfxq\" (UID: \"5062a2a0-2135-43ec-914d-2e50f8541f22\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.554176 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.554207 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.554215 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.554230 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.554239 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.656464 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.656545 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.656565 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.656590 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.656608 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.695233 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="8af5eb9145889f1c88fac8fbb36ddf54044bb928a2dee317b540e10440ca0905" exitCode=0
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.695297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"8af5eb9145889f1c88fac8fbb36ddf54044bb928a2dee317b540e10440ca0905"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.698162 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6badba200df596a1a906290f5d33f62e6616ccd046f2190d4416aa42a562b876"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.704239 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x2g9k" event={"ID":"8d930370-de32-4898-b09e-bc82901ab59c","Type":"ContainerStarted","Data":"b118a647325a7535bf5447c6815776f70a0653c3db2474ca20b83520ec0dec1e"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.705033 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.705772 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.737932 4971 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.758521 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.758557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.758567 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.758584 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.758595 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: W0309 09:21:46.766151 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5062a2a0_2135_43ec_914d_2e50f8541f22.slice/crio-44a319799461eee439cb32e537824a6d45e7a6064551e30e60c35b343af9b0bf WatchSource:0}: Error finding container 44a319799461eee439cb32e537824a6d45e7a6064551e30e60c35b343af9b0bf: Status 404 returned error can't find the container with id 44a319799461eee439cb32e537824a6d45e7a6064551e30e60c35b343af9b0bf
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.780939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.781010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.781670 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.781731 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:50.781714287 +0000 UTC m=+114.341642097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.782040 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.782072 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:50.782065127 +0000 UTC m=+114.341992937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.862048 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.862434 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.862449 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.862465 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.862478 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.881719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.881867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.881895 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882023 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882040 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882050 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882091 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:50.882079427 +0000 UTC m=+114.442007237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882457 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:21:50.882449788 +0000 UTC m=+114.442377598 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882503 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882511 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882520 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.882540 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:50.88253419 +0000 UTC m=+114.442462000 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.964259 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.964301 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.964320 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.964340 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.964384 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:46Z","lastTransitionTime":"2026-03-09T09:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:46 crc kubenswrapper[4971]: I0309 09:21:46.982369 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb"
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.982500 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:21:46 crc kubenswrapper[4971]: E0309 09:21:46.982554 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs podName:8b19b44a-0898-4886-b5d2-4bc4ff950094 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:47.982539909 +0000 UTC m=+111.542467719 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs") pod "network-metrics-daemon-9lhtb" (UID: "8b19b44a-0898-4886-b5d2-4bc4ff950094") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.066557 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.066606 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.066617 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.066635 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.066649 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.151223 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.151284 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:21:47 crc kubenswrapper[4971]: E0309 09:21:47.152834 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.153408 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:21:47 crc kubenswrapper[4971]: E0309 09:21:47.153534 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:21:47 crc kubenswrapper[4971]: E0309 09:21:47.153655 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.169264 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.169290 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.169298 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.169309 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.169318 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.274045 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.274109 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.274120 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.274138 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.274150 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.377008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.377068 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.377094 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.377128 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.377153 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.479748 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.480036 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.480045 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.480058 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.480066 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.582722 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.582749 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.582757 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.582769 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.582777 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.686037 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.686102 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.686111 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.686124 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.686132 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.709602 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" event={"ID":"5062a2a0-2135-43ec-914d-2e50f8541f22","Type":"ContainerStarted","Data":"0ecb43f6a5de0bfa48d17553efd50033532ad173ef7d3e4e587584a2abc52c88"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.709659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" event={"ID":"5062a2a0-2135-43ec-914d-2e50f8541f22","Type":"ContainerStarted","Data":"1972bf46af166d22eb573080c1c083c045f519301e41cb3290cfc899eb526d93"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.709674 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" event={"ID":"5062a2a0-2135-43ec-914d-2e50f8541f22","Type":"ContainerStarted","Data":"44a319799461eee439cb32e537824a6d45e7a6064551e30e60c35b343af9b0bf"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713372 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713420 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713433 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.713463 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.715074 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x2g9k" event={"ID":"8d930370-de32-4898-b09e-bc82901ab59c","Type":"ContainerStarted","Data":"dd0df5391bfeccebeb24f04df5f398ebd2c2c3dac333def9e2e53adf19292a35"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.717720 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="d08bc20ce079d7c28cd93a039028da8c911f6054a3464e1b92a2e7ce3eb0ab08" exitCode=0
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.717810 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"d08bc20ce079d7c28cd93a039028da8c911f6054a3464e1b92a2e7ce3eb0ab08"}
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.728327 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-clfxq" podStartSLOduration=47.728304289 podStartE2EDuration="47.728304289s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:47.726991171 +0000 UTC m=+111.286919021" watchObservedRunningTime="2026-03-09 09:21:47.728304289 +0000 UTC m=+111.288232139"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.783503 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x2g9k" podStartSLOduration=48.78347601 podStartE2EDuration="48.78347601s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:47.753837745 +0000 UTC m=+111.313765565" watchObservedRunningTime="2026-03-09 09:21:47.78347601 +0000 UTC m=+111.343403860"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.788215 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.788256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.788270 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.788291 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09
09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.788312 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.890930 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.890979 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.890991 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.891008 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.891021 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.991165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:47 crc kubenswrapper[4971]: E0309 09:21:47.991296 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:47 crc kubenswrapper[4971]: E0309 09:21:47.991366 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs podName:8b19b44a-0898-4886-b5d2-4bc4ff950094 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:49.991335087 +0000 UTC m=+113.551262897 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs") pod "network-metrics-daemon-9lhtb" (UID: "8b19b44a-0898-4886-b5d2-4bc4ff950094") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.992630 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.992660 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.992672 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.992687 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:47 crc kubenswrapper[4971]: I0309 09:21:47.992697 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:47Z","lastTransitionTime":"2026-03-09T09:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.094770 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.094812 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.094824 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.094840 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.094852 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.151834 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:48 crc kubenswrapper[4971]: E0309 09:21:48.152034 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.197083 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.197122 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.197130 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.197143 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.197153 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.300948 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.301011 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.301027 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.301049 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.301065 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.403847 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.403901 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.403913 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.403931 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.403941 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.506227 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.506285 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.506304 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.506343 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.506405 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.609439 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.609501 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.609517 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.609547 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.609565 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.712793 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.712867 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.712889 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.712919 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.712941 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.724673 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="f21740f6011e6aa28b24d28dc3a46ce43e2f5ea6dfcd6494b6662f33c303ba1e" exitCode=0 Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.724772 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"f21740f6011e6aa28b24d28dc3a46ce43e2f5ea6dfcd6494b6662f33c303ba1e"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.816863 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.816906 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.816916 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.816932 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.816943 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.920267 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.920300 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.920310 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.920325 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:48 crc kubenswrapper[4971]: I0309 09:21:48.920335 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:48Z","lastTransitionTime":"2026-03-09T09:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.023172 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.023204 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.023214 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.023234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.023244 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.125672 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.125705 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.125714 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.125728 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.125738 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.151606 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.151621 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:49 crc kubenswrapper[4971]: E0309 09:21:49.151730 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:49 crc kubenswrapper[4971]: E0309 09:21:49.154546 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.154759 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:49 crc kubenswrapper[4971]: E0309 09:21:49.156690 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.229519 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.229581 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.229602 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.229628 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.229646 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.332980 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.333058 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.333077 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.333102 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.333123 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.436721 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.436778 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.436837 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.436864 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.436889 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.540114 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.540222 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.542281 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.542755 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.543148 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.646465 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.646524 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.646541 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.646559 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.646572 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.731304 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.733912 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="0249f5698e5b560f798cfedfd422ae0acbce99e2095c05c5dc1a37c743819c81" exitCode=0 Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.733941 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"0249f5698e5b560f798cfedfd422ae0acbce99e2095c05c5dc1a37c743819c81"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.753718 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.753895 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.753954 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.754014 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.754079 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.856542 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.856573 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.856582 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.856597 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.856607 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.959203 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.959236 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.959248 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.959271 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:49 crc kubenswrapper[4971]: I0309 09:21:49.959286 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:49Z","lastTransitionTime":"2026-03-09T09:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.015611 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.015864 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.015971 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs podName:8b19b44a-0898-4886-b5d2-4bc4ff950094 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:54.015946647 +0000 UTC m=+117.575874537 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs") pod "network-metrics-daemon-9lhtb" (UID: "8b19b44a-0898-4886-b5d2-4bc4ff950094") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.063331 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.063655 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.063665 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.063679 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.063691 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.152184 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.152448 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.166466 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.166515 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.166532 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.166555 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.166573 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.269825 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.269881 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.269897 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.269920 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.269937 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.377839 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.377908 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.377931 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.377963 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.377984 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.481176 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.481237 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.481251 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.481281 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.481300 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.584087 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.584144 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.584166 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.584189 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.584206 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.686862 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.686905 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.686915 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.686928 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.686937 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.742308 4971 generic.go:334] "Generic (PLEG): container finished" podID="c18b589a-f45c-4d0d-9779-3e39d74e057a" containerID="7aa0ced69e42638943285f0769fdb75ca150148d12d30d33cc5257310ca1da2d" exitCode=0 Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.742402 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerDied","Data":"7aa0ced69e42638943285f0769fdb75ca150148d12d30d33cc5257310ca1da2d"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.789929 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.789971 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.789984 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.790004 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.790018 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.830874 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.830965 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.831549 4971 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.831630 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.831604627 +0000 UTC m=+122.391532447 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.831748 4971 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.841817 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.841725843 +0000 UTC m=+122.401653663 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.892872 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.892933 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.892948 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.892969 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.892981 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.932472 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.932743 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.932704319 +0000 UTC m=+122.492632139 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.933039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.933158 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933288 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933327 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933346 4971 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933412 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.933400529 +0000 UTC m=+122.493328539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933636 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933708 4971 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933764 4971 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:50 crc kubenswrapper[4971]: E0309 09:21:50.933876 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.933857212 +0000 UTC m=+122.493785022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.995584 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.995977 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.995989 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.996004 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:50 crc kubenswrapper[4971]: I0309 09:21:50.996017 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:50Z","lastTransitionTime":"2026-03-09T09:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.099903 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.100350 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.100456 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.100530 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.100619 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.151662 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.152101 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:51 crc kubenswrapper[4971]: E0309 09:21:51.152313 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.153184 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:51 crc kubenswrapper[4971]: E0309 09:21:51.153353 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:51 crc kubenswrapper[4971]: E0309 09:21:51.153196 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.204091 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.204124 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.204135 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.204150 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.204161 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.307724 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.308033 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.308172 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.308306 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.308476 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.412792 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.412859 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.412880 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.412983 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.413006 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.515696 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.516227 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.516430 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.516563 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.516684 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.621025 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.621106 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.621126 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.621157 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.621174 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.725394 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.725437 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.725449 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.725468 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.725480 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.749177 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" event={"ID":"c18b589a-f45c-4d0d-9779-3e39d74e057a","Type":"ContainerStarted","Data":"208a7d4dd5491190a55127e35a4df4812dc6d03650e466dbdc5c07c6cd0f53fe"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.771703 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wbp4g" podStartSLOduration=51.77168802 podStartE2EDuration="51.77168802s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:51.771196495 +0000 UTC m=+115.331124305" watchObservedRunningTime="2026-03-09 09:21:51.77168802 +0000 UTC m=+115.331615830" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.827669 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.827709 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.827720 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.827736 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.827745 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.930419 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.930457 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.930470 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.930491 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:51 crc kubenswrapper[4971]: I0309 09:21:51.930509 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:51Z","lastTransitionTime":"2026-03-09T09:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.032362 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.032406 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.032417 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.032443 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.032454 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.134219 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.134253 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.134265 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.134278 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.134287 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.151333 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:52 crc kubenswrapper[4971]: E0309 09:21:52.151477 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.236146 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.236203 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.236214 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.236230 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.236240 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.339822 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.339895 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.339918 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.339936 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.339948 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.442617 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.442669 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.442681 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.442699 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.442710 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.544801 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.544857 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.544871 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.544917 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.544933 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.647323 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.647373 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.647399 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.647416 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.647429 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.750479 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.750558 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.750576 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.750600 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.750623 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.757341 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerStarted","Data":"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.757861 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.757941 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.757957 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.786492 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podStartSLOduration=52.786463751 podStartE2EDuration="52.786463751s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:52.786462021 +0000 UTC m=+116.346389851" watchObservedRunningTime="2026-03-09 09:21:52.786463751 +0000 UTC m=+116.346391561" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.788136 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.788198 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.852718 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.852755 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.852783 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.852819 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.852829 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.955565 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.955600 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.955608 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.955624 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:52 crc kubenswrapper[4971]: I0309 09:21:52.955633 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:52Z","lastTransitionTime":"2026-03-09T09:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.057858 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.057925 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.057946 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.057973 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.057993 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.151060 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.151105 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.151123 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:53 crc kubenswrapper[4971]: E0309 09:21:53.151283 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:53 crc kubenswrapper[4971]: E0309 09:21:53.151424 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:53 crc kubenswrapper[4971]: E0309 09:21:53.151603 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.165194 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.165246 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.165256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.165903 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.165953 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.268384 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.268434 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.268453 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.268477 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.268497 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.371295 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.371400 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.371427 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.371458 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.371480 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.474327 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.474805 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.474978 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.475144 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.475312 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.578377 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.578418 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.578427 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.578442 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.578455 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.681167 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.681220 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.681229 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.681246 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.681259 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.783232 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.783270 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.783279 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.783292 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.783303 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.885710 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.885747 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.885755 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.885770 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.885779 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.988002 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.988041 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.988053 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.988067 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.988077 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:53Z","lastTransitionTime":"2026-03-09T09:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.999567 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9lhtb"] Mar 09 09:21:53 crc kubenswrapper[4971]: I0309 09:21:53.999727 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:53 crc kubenswrapper[4971]: E0309 09:21:53.999866 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.068532 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:54 crc kubenswrapper[4971]: E0309 09:21:54.068702 4971 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:54 crc kubenswrapper[4971]: E0309 09:21:54.068764 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs podName:8b19b44a-0898-4886-b5d2-4bc4ff950094 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.068749523 +0000 UTC m=+125.628677333 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs") pod "network-metrics-daemon-9lhtb" (UID: "8b19b44a-0898-4886-b5d2-4bc4ff950094") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.090553 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.090590 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.090599 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.090611 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.090621 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.193288 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.193332 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.193343 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.193365 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.193398 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.295698 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.295726 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.295734 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.295748 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.295757 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.401694 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.401997 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.402007 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.402022 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.402050 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.504498 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.504554 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.504569 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.504588 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.504599 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.607164 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.607243 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.607261 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.607287 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.607303 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.710162 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.710202 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.710217 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.710234 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.710246 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.813502 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.813587 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.813613 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.813640 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.813659 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.904209 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.904256 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.904267 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.904282 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.904292 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.918753 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.918788 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.918799 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.918815 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.918827 4971 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:21:54Z","lastTransitionTime":"2026-03-09T09:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.946906 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz"] Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.947226 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.948947 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.949309 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.949366 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 09:21:54 crc kubenswrapper[4971]: I0309 09:21:54.949318 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.079987 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.080475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974a5786-65c9-4a73-9ff4-de2bf3e88ace-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.080767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/974a5786-65c9-4a73-9ff4-de2bf3e88ace-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.081120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.081329 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974a5786-65c9-4a73-9ff4-de2bf3e88ace-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.149610 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.150953 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.150966 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.151012 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:55 crc kubenswrapper[4971]: E0309 09:21:55.151052 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:21:55 crc kubenswrapper[4971]: E0309 09:21:55.151101 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:21:55 crc kubenswrapper[4971]: E0309 09:21:55.151157 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.161637 4971 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182116 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974a5786-65c9-4a73-9ff4-de2bf3e88ace-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182149 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182186 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974a5786-65c9-4a73-9ff4-de2bf3e88ace-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182187 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182332 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/974a5786-65c9-4a73-9ff4-de2bf3e88ace-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.182211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/974a5786-65c9-4a73-9ff4-de2bf3e88ace-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.183056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974a5786-65c9-4a73-9ff4-de2bf3e88ace-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.196463 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/974a5786-65c9-4a73-9ff4-de2bf3e88ace-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.202127 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/974a5786-65c9-4a73-9ff4-de2bf3e88ace-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cldcz\" (UID: \"974a5786-65c9-4a73-9ff4-de2bf3e88ace\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.265855 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" Mar 09 09:21:55 crc kubenswrapper[4971]: W0309 09:21:55.277916 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974a5786_65c9_4a73_9ff4_de2bf3e88ace.slice/crio-439bbbdb96a51754c40afa171381eb1b0a2f87812c6ab2423a38cd1f47a94709 WatchSource:0}: Error finding container 439bbbdb96a51754c40afa171381eb1b0a2f87812c6ab2423a38cd1f47a94709: Status 404 returned error can't find the container with id 439bbbdb96a51754c40afa171381eb1b0a2f87812c6ab2423a38cd1f47a94709 Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.767497 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" event={"ID":"974a5786-65c9-4a73-9ff4-de2bf3e88ace","Type":"ContainerStarted","Data":"49efde74f4558eb5ee32d48aaa999632881c598c4ea1abc0302c1a718f4f2d21"} Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.767554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" 
event={"ID":"974a5786-65c9-4a73-9ff4-de2bf3e88ace","Type":"ContainerStarted","Data":"439bbbdb96a51754c40afa171381eb1b0a2f87812c6ab2423a38cd1f47a94709"} Mar 09 09:21:55 crc kubenswrapper[4971]: I0309 09:21:55.784208 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cldcz" podStartSLOduration=56.784190049 podStartE2EDuration="56.784190049s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:21:55.783280283 +0000 UTC m=+119.343208123" watchObservedRunningTime="2026-03-09 09:21:55.784190049 +0000 UTC m=+119.344117859" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.152624 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.152988 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9lhtb" podUID="8b19b44a-0898-4886-b5d2-4bc4ff950094" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.772462 4971 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.772641 4971 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.820865 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.821800 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.822829 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"] Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.823988 4971 reflector.go:561] object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w": failed to list *v1.Secret: secrets "cluster-samples-operator-dockercfg-xpp9w" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.824073 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xpp9w\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-samples-operator-dockercfg-xpp9w\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" 
logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.825382 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.826727 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.827543 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d25sv"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.828076 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.828196 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.828257 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.828270 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.828362 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d25sv"
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.828336 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.832328 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.832406 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.832540 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.832564 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.832665 4971 reflector.go:561] object-"openshift-cluster-samples-operator"/"samples-operator-tls": failed to list *v1.Secret: secrets "samples-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.832686 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"samples-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834190 4971 reflector.go:561] object-"openshift-cluster-samples-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834233 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834353 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834413 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834511 4971 reflector.go:561] object-"openshift-console"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834536 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834603 4971 reflector.go:561] object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834630 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834762 4971 reflector.go:561] object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z": failed to list *v1.Secret: secrets "openshift-config-operator-dockercfg-7pc5z" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834786 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-7pc5z\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-config-operator-dockercfg-7pc5z\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.834855 4971 reflector.go:561] object-"openshift-console"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.834877 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.835112 4971 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.835175 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.835827 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dnx9z"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.837254 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.843553 4971 reflector.go:561] object-"openshift-config-operator"/"config-operator-serving-cert": failed to list *v1.Secret: secrets "config-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.843660 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"config-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"config-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.843782 4971 reflector.go:561] object-"openshift-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.843817 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.843985 4971 reflector.go:561] object-"openshift-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-config-operator": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.844022 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.844170 4971 reflector.go:561] object-"openshift-console"/"default-dockercfg-chnjx": failed to list *v1.Secret: secrets "default-dockercfg-chnjx" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.844200 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"default-dockercfg-chnjx\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-chnjx\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.844295 4971 reflector.go:561] object-"openshift-console"/"oauth-serving-cert": failed to list *v1.ConfigMap: configmaps "oauth-serving-cert" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.844328 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"oauth-serving-cert\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"oauth-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.844636 4971 reflector.go:561] object-"openshift-console"/"console-dockercfg-f62pw": failed to list *v1.Secret: secrets "console-dockercfg-f62pw" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.844679 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-dockercfg-f62pw\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-dockercfg-f62pw\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.844800 4971 reflector.go:561] object-"openshift-console"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.844847 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.845031 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wf8hd"]
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.845138 4971 reflector.go:561] object-"openshift-console"/"console-oauth-config": failed to list *v1.Secret: secrets "console-oauth-config" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.846493 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-oauth-config\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-oauth-config\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.845218 4971 reflector.go:561] object-"openshift-console"/"console-config": failed to list *v1.ConfigMap: configmaps "console-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.846777 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"console-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.845399 4971 reflector.go:561] object-"openshift-console"/"console-serving-cert": failed to list *v1.Secret: secrets "console-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.847054 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: W0309 09:21:56.845412 4971 reflector.go:561] object-"openshift-console"/"service-ca": failed to list *v1.ConfigMap: configmaps "service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Mar 09 09:21:56 crc kubenswrapper[4971]: E0309 09:21:56.847346 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.853801 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wf8hd"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.857305 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7xwd6"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.857946 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.858054 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.858729 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.859667 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.859960 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rqlbq"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.860024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.860554 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.860787 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.862067 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.862515 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.864478 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.865642 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.868387 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.869306 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nw59v"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.869715 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.870686 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.871257 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.877569 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878409 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878551 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878596 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878622 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878788 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.878882 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879018 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879078 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879123 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879162 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879237 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879297 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879301 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879395 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879437 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879469 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879560 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879601 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879667 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879696 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879575 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879016 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.879804 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.880310 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.882936 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t9hb6"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.883615 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.885001 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.887137 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.888057 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.888974 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mct42"]
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901465 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kd9w\" (UniqueName: \"kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d6be06-8c45-4058-a2ff-5daf63d0404e-serving-cert\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901637 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-node-pullsecrets\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901746 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901864 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-encryption-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901896 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brnkp\" (UniqueName: \"kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit-dir\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.901982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902011 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-client\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-metrics-tls\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902069 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvz99\" (UniqueName: \"kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902101 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5371ca7-5f2f-4b51-add8-021a77d93c9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902144 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-image-import-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902202 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsh6\" (UniqueName: \"kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhl5\" (UniqueName: \"kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902277 4971 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-config\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902309 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902346 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt99\" (UniqueName: \"kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902399 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkn5d\" (UniqueName: \"kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5371ca7-5f2f-4b51-add8-021a77d93c9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902464 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902659 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvnt\" (UniqueName: \"kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt\") pod \"downloads-7954f5f757-d25sv\" (UID: \"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa\") " pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902736 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzhb\" (UniqueName: \"kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-serving-cert\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.902853 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/80f7e4a7-4617-4978-b42e-8a33b6465690-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.903081 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.903113 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:56 crc 
kubenswrapper[4971]: I0309 09:21:56.903142 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-trusted-ca\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.905425 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.905593 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.905914 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.906702 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.906894 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.907603 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.907853 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.908207 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.908404 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.909175 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.909182 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.909865 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.910022 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.912835 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.912934 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.913122 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.913227 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.913298 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.912854 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 
09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.912938 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.913915 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.914466 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.915030 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.919146 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.919509 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.924499 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nvzgg"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.926797 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.927204 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.942228 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.942856 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.943018 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.943170 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.943294 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.951506 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.952058 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.952814 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w4c8h"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.953533 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.953988 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.954282 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.954684 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.955426 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.956386 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.956987 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.957873 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.960144 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.962725 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.965859 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.966269 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.966505 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.966686 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.968812 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.969120 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.969298 4971 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.977594 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.977721 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.977808 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.981384 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-48g6z"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.982006 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.982590 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.982980 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-brkss"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.983417 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.983817 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.984192 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.984571 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.985088 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.985586 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.985692 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.985824 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.986095 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68"] Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.986569 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.988214 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.988439 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.988544 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:21:56 crc kubenswrapper[4971]: I0309 09:21:56.988731 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.010054 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017159 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017238 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017278 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017304 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017327 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017641 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017721 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.017875 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.018161 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.018660 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.018913 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.018914 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.019363 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:21:57 crc kubenswrapper[4971]: 
I0309 09:21:57.019567 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.020352 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca599a0a-36fb-4040-9304-01eb9d4c19d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.020866 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhl5\" (UniqueName: \"kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.020922 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-config\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.020978 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021017 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ca599a0a-36fb-4040-9304-01eb9d4c19d0-config\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021067 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt99\" (UniqueName: \"kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021090 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn5d\" (UniqueName: \"kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021154 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 
09:21:57.021178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5371ca7-5f2f-4b51-add8-021a77d93c9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021199 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjgtf\" (UniqueName: \"kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvnt\" (UniqueName: \"kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt\") pod \"downloads-7954f5f757-d25sv\" (UID: \"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa\") " pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021403 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzhb\" (UniqueName: \"kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021446 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit\") pod \"apiserver-76f77b778f-rqlbq\" (UID: 
\"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021539 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-serving-cert\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021630 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021666 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-trusted-ca\") pod \"console-operator-58897d9998-wf8hd\" (UID: 
\"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021749 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021783 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2crc\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-kube-api-access-l2crc\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021823 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/80f7e4a7-4617-4978-b42e-8a33b6465690-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.021938 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca599a0a-36fb-4040-9304-01eb9d4c19d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 
09:21:57.021987 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kd9w\" (UniqueName: \"kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022017 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d6be06-8c45-4058-a2ff-5daf63d0404e-serving-cert\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022065 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-node-pullsecrets\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022138 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-config\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" 
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022281 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5371ca7-5f2f-4b51-add8-021a77d93c9c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.022289 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.029814 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.023207 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.029972 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.024016 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-node-pullsecrets\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.024147 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.024830 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-serving-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.025076 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.029431 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.030017 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.030989 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-encryption-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031138 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031297 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brnkp\" (UniqueName: \"kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031491 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit-dir\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031327 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-audit-dir\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.031957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032092 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-client\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032122 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-metrics-tls\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.029588 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvz99\" (UniqueName: \"kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032305 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032481 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5371ca7-5f2f-4b51-add8-021a77d93c9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032503 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 
09:21:57.032794 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032819 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-image-import-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032857 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.032897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsh6\" (UniqueName: \"kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.034540 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-image-import-ca\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc 
kubenswrapper[4971]: I0309 09:21:57.034725 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.038321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79d6be06-8c45-4058-a2ff-5daf63d0404e-trusted-ca\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.023416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/80f7e4a7-4617-4978-b42e-8a33b6465690-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.024097 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040435 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040477 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d25sv"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040490 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fg4hj"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040947 
4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dnx9z"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040963 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wf8hd"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.040974 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-n8lbv"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.043711 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.044507 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.044844 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7xwd6"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.044897 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-57xjq"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.045724 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.049834 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.060874 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57xjq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.061162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-serving-cert\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.063423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-etcd-client\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.063528 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.063929 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-encryption-config\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.064018 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-metrics-tls\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.064319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5371ca7-5f2f-4b51-add8-021a77d93c9c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.066588 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.067569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79d6be06-8c45-4058-a2ff-5daf63d0404e-serving-cert\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.067769 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.068010 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rqlbq"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.072154 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.072182 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.072194 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.077799 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.077825 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-48g6z"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.077834 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t9hb6"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.084744 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.085460 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.086699 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mct42"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.086749 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.088791 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.089733 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.091691 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t"] Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.093966 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 
09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.095379 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.100159 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nw59v"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.101295 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.102230 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.103099 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nvzgg"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.104397 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.106428 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.109486 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-brkss"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.110821 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.112578 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.115896 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-krqgm"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.117138 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.118704 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4g4cr"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.119323 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.120516 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fg4hj"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.121834 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.123731 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.125528 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.126090 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.126651 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.128478 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.131198 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-krqgm"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.135919 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-th2ls"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.138894 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57xjq"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139166 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjgtf\" (UniqueName: \"kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139259 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139275 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-th2ls"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139473 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2crc\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-kube-api-access-l2crc\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139546 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca599a0a-36fb-4040-9304-01eb9d4c19d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139648 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139702 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139838 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139946 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.140006 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.139906 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th2ls"]
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.140208 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca599a0a-36fb-4040-9304-01eb9d4c19d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.140316 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca599a0a-36fb-4040-9304-01eb9d4c19d0-config\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.145902 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.151168 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.151255 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.151235 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.151826 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"
Mar 09 09:21:57 crc kubenswrapper[4971]: E0309 09:21:57.151990 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.164284 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.183860 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.203916 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.237189 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.244430 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.263834 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.284982 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.326179 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.329631 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.347385 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.363513 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.374403 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca599a0a-36fb-4040-9304-01eb9d4c19d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.384036 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.391313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca599a0a-36fb-4040-9304-01eb9d4c19d0-config\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.403851 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.423292 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.443769 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.469124 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.483498 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.503548 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.523879 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.546383 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.613631 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.624124 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.643774 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.664566 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.684140 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.704674 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.724502 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.743886 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.764569 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.784306 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.804310 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.823818 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.843776 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.872119 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.883649 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.903867 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 09:21:57 crc kubenswrapper[4971]: I0309 09:21:57.924115 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.021785 4971 configmap.go:193] Couldn't get configMap openshift-console/oauth-serving-cert: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.021879 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.521858999 +0000 UTC m=+122.081786819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "oauth-serving-cert" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.024301 4971 configmap.go:193] Couldn't get configMap openshift-console/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.024429 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.524407383 +0000 UTC m=+122.084335193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.032892 4971 secret.go:188] Couldn't get secret openshift-console/console-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.032906 4971 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033012 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.532984474 +0000 UTC m=+122.092912314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033054 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.533037565 +0000 UTC m=+122.092965455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033222 4971 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033254 4971 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033383 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.533350154 +0000 UTC m=+122.093278044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033643 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.533626162 +0000 UTC m=+122.093554042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033685 4971 configmap.go:193] Couldn't get configMap openshift-console/console-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.033851 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.533839659 +0000 UTC m=+122.093767579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-config" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.036066 4971 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.036138 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert podName:80f7e4a7-4617-4978-b42e-8a33b6465690 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.536115215 +0000 UTC m=+122.096043035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert") pod "openshift-config-operator-7777fb866f-h78mk" (UID: "80f7e4a7-4617-4978-b42e-8a33b6465690") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.039722 4971 secret.go:188] Couldn't get secret openshift-console/console-oauth-config: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.039733 4971 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.039796 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.539780322 +0000 UTC m=+122.099708212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-oauth-config" (UniqueName: "kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.039884 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls podName:d28dca1b-efeb-4b15-833b-8bc78aa16238 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.539866305 +0000 UTC m=+122.099794125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-hg8qg" (UID: "d28dca1b-efeb-4b15-833b-8bc78aa16238") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.041785 4971 request.go:700] Waited for 1.009960422s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.041799 4971 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.041880 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.541862083 +0000 UTC m=+122.101789993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.065904 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.084206 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.123670 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140057 4971 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140165 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca podName:d4feaecd-f674-489f-a6d3-12e5f433d90e nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.640135362 +0000 UTC m=+122.200063172 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca") pod "cluster-image-registry-operator-dc59b4c8b-cm5pk" (UID: "d4feaecd-f674-489f-a6d3-12e5f433d90e") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140191 4971 secret.go:188] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140279 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls podName:d4feaecd-f674-489f-a6d3-12e5f433d90e nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.640257435 +0000 UTC m=+122.200185305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls") pod "cluster-image-registry-operator-dc59b4c8b-cm5pk" (UID: "d4feaecd-f674-489f-a6d3-12e5f433d90e") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140325 4971 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.140378 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.640367328 +0000 UTC m=+122.200295138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141484 4971 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141509 4971 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141534 4971 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141538 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.641528172 +0000 UTC m=+122.201455982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141584 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.641573834 +0000 UTC m=+122.201501644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.141596 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.641590164 +0000 UTC m=+122.201517974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.142657 4971 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.142699 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:21:58.642690766 +0000 UTC m=+122.202618576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.144075 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.150927 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.163272 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.183675 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.203421 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.225190 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.243470 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.263303 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.283230 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.305117 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.323986 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.344373 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.365238 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.384206 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.404604 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.423302 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.443428 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.464953 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.484584
4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.504409 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.524483 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.543641 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556511 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556555 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556611 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556639 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556739 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556813 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556840 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.556872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.564144 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.583790 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.603327 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.624470 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 
09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.644609 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.657942 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658148 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658217 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658455 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658504 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658552 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.658734 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.683641 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.704405 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.723761 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.743417 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.764071 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" 
Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.784233 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.804471 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.844959 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.860781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.860914 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.863857 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.883141 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.903673 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 09:21:58 crc 
kubenswrapper[4971]: I0309 09:21:58.924031 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.943908 4971 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.958179 4971 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.962021 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.962188 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:14.962169408 +0000 UTC m=+138.522097218 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.962337 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.962722 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.964427 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 09:21:58 crc kubenswrapper[4971]: E0309 09:21:58.978746 4971 projected.go:288] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:58 crc kubenswrapper[4971]: I0309 09:21:58.986702 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.005230 4971 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.008521 4971 projected.go:288] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.017896 4971 projected.go:288] Couldn't get configMap openshift-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.024004 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.037996 4971 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.042659 4971 request.go:700] Waited for 1.902747334s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.044592 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.054904 4971 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.078694 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2crc\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-kube-api-access-l2crc\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: 
\"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.083760 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.104135 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.116154 4971 projected.go:288] Couldn't get configMap openshift-dns-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.129926 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.140483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4feaecd-f674-489f-a6d3-12e5f433d90e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.165005 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.166851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca599a0a-36fb-4040-9304-01eb9d4c19d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bzmmh\" (UID: \"ca599a0a-36fb-4040-9304-01eb9d4c19d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.172805 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4feaecd-f674-489f-a6d3-12e5f433d90e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.204068 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.210137 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-config\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.223458 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.229390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-service-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.260592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4feaecd-f674-489f-a6d3-12e5f433d90e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cm5pk\" (UID: \"d4feaecd-f674-489f-a6d3-12e5f433d90e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 
09:21:59.263696 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.275871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-serving-cert\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.283426 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.291452 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-client\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.297548 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.304070 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.313953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-etcd-ca\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.323461 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.333294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.343962 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.354433 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.364301 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.383719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.397840 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.399610 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.444182 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.463564 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.467779 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:21:59 crc kubenswrapper[4971]: 
I0309 09:21:59.468266 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468430 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-encryption-config\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468493 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468577 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-stats-auth\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 
09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468623 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5073d2d2-177a-4e70-9638-7fe56084c301-audit-dir\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468728 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9xf\" (UniqueName: \"kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468814 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rps\" (UniqueName: 
\"kubernetes.io/projected/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-kube-api-access-l8rps\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468868 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468905 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468928 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db64f07f-f1cb-4754-8e1f-33951a826f78-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468957 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.468982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469024 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469088 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e0270a9-8b08-4abf-88da-75319c5e6f48-service-ca-bundle\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 
09:21:59.469131 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzdw\" (UniqueName: \"kubernetes.io/projected/5073d2d2-177a-4e70-9638-7fe56084c301-kube-api-access-rpzdw\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469151 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469307 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469338 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469381 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b5a937-bb23-4b89-86a3-6ad4944f5440-serving-cert\") pod 
\"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469407 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7tf\" (UniqueName: \"kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469525 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 
09:21:59.469542 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:21:59.969529938 +0000 UTC m=+123.529457748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469591 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469614 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469629 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469648 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469708 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469789 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469823 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469844 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469875 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlzb\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vq4\" (UniqueName: \"kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: 
\"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.469997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470022 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470054 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtn24\" (UniqueName: \"kubernetes.io/projected/9e0270a9-8b08-4abf-88da-75319c5e6f48-kube-api-access-vtn24\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470102 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph782\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-kube-api-access-ph782\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470492 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470585 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-metrics-certs\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470610 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470704 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470737 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-default-certificate\") pod \"router-default-5444994796-w4c8h\" (UID: 
\"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470755 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htn6w\" (UniqueName: \"kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.470974 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5mrq\" (UniqueName: \"kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9hh\" (UniqueName: \"kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471049 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471098 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471121 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471180 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-audit-policies\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471201 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-config\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: 
\"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471223 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.471245 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.476783 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh"] Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.484693 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 09:21:59 crc kubenswrapper[4971]: W0309 09:21:59.487105 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca599a0a_36fb_4040_9304_01eb9d4c19d0.slice/crio-6b4152f684aff152a0c84dd7638e7403fa09e7463d5bb999367b129655395091 WatchSource:0}: Error finding container 6b4152f684aff152a0c84dd7638e7403fa09e7463d5bb999367b129655395091: Status 404 returned error can't find the container with id 6b4152f684aff152a0c84dd7638e7403fa09e7463d5bb999367b129655395091 Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.503063 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.508459 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-oauth-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.530444 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.543510 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.549050 4971 projected.go:194] Error preparing data for projected volume kube-api-access-4kd9w for pod openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.549138 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w podName:d28dca1b-efeb-4b15-833b-8bc78aa16238 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.049118841 +0000 UTC m=+123.609046651 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4kd9w" (UniqueName: "kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w") pod "cluster-samples-operator-665b6dd947-hg8qg" (UID: "d28dca1b-efeb-4b15-833b-8bc78aa16238") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.555031 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556699 4971 configmap.go:193] Couldn't get configMap openshift-console/console-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556761 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.556748214 +0000 UTC m=+124.116676024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-config" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556713 4971 secret.go:188] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556848 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert podName:80f7e4a7-4617-4978-b42e-8a33b6465690 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.556835916 +0000 UTC m=+124.116763726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert") pod "openshift-config-operator-7777fb866f-h78mk" (UID: "80f7e4a7-4617-4978-b42e-8a33b6465690") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556885 4971 secret.go:188] Couldn't get secret openshift-console/console-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556903 4971 secret.go:188] Couldn't get secret openshift-console/console-oauth-config: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556937 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.556928359 +0000 UTC m=+124.116856169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-oauth-config" (UniqueName: "kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556954 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.556945669 +0000 UTC m=+124.116873479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556963 4971 configmap.go:193] Couldn't get configMap openshift-console/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556979 4971 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556996 4971 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.557016 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.557006321 +0000 UTC m=+124.116934131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556966 4971 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.556891 4971 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.557057 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.557023722 +0000 UTC m=+124.116951562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.557076 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls podName:d28dca1b-efeb-4b15-833b-8bc78aa16238 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.557068503 +0000 UTC m=+124.116996313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-hg8qg" (UID: "d28dca1b-efeb-4b15-833b-8bc78aa16238") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.557094 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.557086883 +0000 UTC m=+124.117014873 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.557107 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.557102524 +0000 UTC m=+124.117030334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.562713 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.562742 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.570613 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.571817 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.071797343 +0000 UTC m=+123.631725163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571865 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlzb\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571941 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vq4\" (UniqueName: \"kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571961 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.571981 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572020 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c0f392-b6e8-4719-8cad-0d267cf8b955-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572040 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwg7w\" (UniqueName: \"kubernetes.io/projected/56909779-30d4-4350-810d-9675796d96ad-kube-api-access-hwg7w\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572055 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-certs\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572072 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph782\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-kube-api-access-ph782\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm8w\" (UniqueName: \"kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572127 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-csi-data-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572143 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxf26\" (UniqueName: \"kubernetes.io/projected/43af91cb-669b-473a-a92f-d6b8fffa0cc7-kube-api-access-jxf26\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f372b9ff-41d6-4712-bf7a-9c229f1f7673-config-volume\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572197 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572231 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-default-certificate\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5mrq\" (UniqueName: \"kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572339 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9wjh\" (UniqueName: \"kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572392 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbb9\" (UniqueName: \"kubernetes.io/projected/83f8f490-e050-4721-8784-0879496323ad-kube-api-access-xfbb9\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572411 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572430 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-audit-policies\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-config\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572538 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-plugins-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572617 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572636 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-encryption-config\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572653 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-node-bootstrap-token\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572669 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqcm\" (UniqueName: \"kubernetes.io/projected/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-kube-api-access-jtqcm\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btv9h\" (UniqueName: \"kubernetes.io/projected/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-kube-api-access-btv9h\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572721 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572753 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-stats-auth\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572770 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572787 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5073d2d2-177a-4e70-9638-7fe56084c301-audit-dir\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572803 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-tmpfs\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-registration-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572838 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vj4\" (UniqueName: \"kubernetes.io/projected/6dba6300-591c-4fc5-8544-b208731d2dc6-kube-api-access-x7vj4\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572878 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95c0f392-b6e8-4719-8cad-0d267cf8b955-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-images\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572919 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572941 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.572960 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b5fx\" (UniqueName: \"kubernetes.io/projected/7c2058d7-0c77-4f28-a103-679184ed575c-kube-api-access-2b5fx\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573014 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573031 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-config\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573047 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7rf\" (UniqueName: \"kubernetes.io/projected/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-kube-api-access-4m7rf\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573092 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-srv-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573209 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39079cdf-1b40-4f77-ad11-3816fc89e3df-proxy-tls\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573235 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2058d7-0c77-4f28-a103-679184ed575c-config\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573307 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzdw\" (UniqueName: \"kubernetes.io/projected/5073d2d2-177a-4e70-9638-7fe56084c301-kube-api-access-rpzdw\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573342 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573404 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573422 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7tf\" (UniqueName: \"kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573462 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmfp\" (UniqueName: \"kubernetes.io/projected/f372b9ff-41d6-4712-bf7a-9c229f1f7673-kube-api-access-8fmfp\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573506 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-cabundle\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573554 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83f8f490-e050-4721-8784-0879496323ad-proxy-tls\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573622 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573639 4971 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573664 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573703 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx2kb\" (UniqueName: \"kubernetes.io/projected/39079cdf-1b40-4f77-ad11-3816fc89e3df-kube-api-access-vx2kb\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftf96\" (UniqueName: \"kubernetes.io/projected/339ca768-fe61-40dd-8a4f-93363aa23972-kube-api-access-ftf96\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573741 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmnd\" (UniqueName: \"kubernetes.io/projected/f7b8f2b8-0607-467d-8ba2-3b823817b639-kube-api-access-hjmnd\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: 
\"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573779 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573800 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573818 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573854 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573873 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dba6300-591c-4fc5-8544-b208731d2dc6-cert\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573899 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtn24\" (UniqueName: \"kubernetes.io/projected/9e0270a9-8b08-4abf-88da-75319c5e6f48-kube-api-access-vtn24\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573944 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-metrics-certs\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573964 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.573990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htn6w\" (UniqueName: \"kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 
09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574027 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574045 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-key\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-socket-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wnj\" (UniqueName: \"kubernetes.io/projected/2432a454-1fbc-4fe4-a6cb-27292e8b670d-kube-api-access-j2wnj\") pod \"migrator-59844c95c7-mbv68\" (UID: \"2432a454-1fbc-4fe4-a6cb-27292e8b670d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f372b9ff-41d6-4712-bf7a-9c229f1f7673-metrics-tls\") pod \"dns-default-th2ls\" (UID: 
\"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574177 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9hh\" (UniqueName: \"kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574215 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2058d7-0c77-4f28-a103-679184ed575c-serving-cert\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574298 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574505 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574570 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574589 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78kq2\" (UniqueName: \"kubernetes.io/projected/c91cf18b-1765-48d3-9e00-66747b628f33-kube-api-access-78kq2\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:59 crc 
kubenswrapper[4971]: I0309 09:21:59.574615 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-webhook-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574659 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574753 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9xf\" (UniqueName: \"kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574800 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rps\" (UniqueName: \"kubernetes.io/projected/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-kube-api-access-l8rps\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574819 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/f7b8f2b8-0607-467d-8ba2-3b823817b639-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: \"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574835 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db64f07f-f1cb-4754-8e1f-33951a826f78-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574852 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39079cdf-1b40-4f77-ad11-3816fc89e3df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-apiservice-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574914 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" 
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.574990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575008 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-profile-collector-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575046 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c0f392-b6e8-4719-8cad-0d267cf8b955-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575074 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575089 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-mountpoint-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575132 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575210 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e0270a9-8b08-4abf-88da-75319c5e6f48-service-ca-bundle\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" 
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575227 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/339ca768-fe61-40dd-8a4f-93363aa23972-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575306 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdx6\" (UniqueName: \"kubernetes.io/projected/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-kube-api-access-gsdx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575323 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575341 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575372 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b5a937-bb23-4b89-86a3-6ad4944f5440-serving-cert\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575403 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575489 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575530 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlsz\" (UniqueName: \"kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575568 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-srv-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.575659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.577232 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.578597 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5073d2d2-177a-4e70-9638-7fe56084c301-audit-dir\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.578750 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.579349 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.579751 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.580302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.580517 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.581282 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.081249859 +0000 UTC m=+123.641177659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.581383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.581647 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e0270a9-8b08-4abf-88da-75319c5e6f48-service-ca-bundle\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.581889 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.582143 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-stats-auth\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " 
pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.583217 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.583312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.583916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.584372 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-default-certificate\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.585502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert\") pod 
\"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.585666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.587828 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.588479 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e0270a9-8b08-4abf-88da-75319c5e6f48-metrics-certs\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.588972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.590374 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.592616 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.593595 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.604105 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.605713 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.624561 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.645235 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.667654 4971 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.676834 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677024 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-cabundle\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677057 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83f8f490-e050-4721-8784-0879496323ad-proxy-tls\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677081 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677123 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx2kb\" (UniqueName: \"kubernetes.io/projected/39079cdf-1b40-4f77-ad11-3816fc89e3df-kube-api-access-vx2kb\") pod 
\"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftf96\" (UniqueName: \"kubernetes.io/projected/339ca768-fe61-40dd-8a4f-93363aa23972-kube-api-access-ftf96\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmnd\" (UniqueName: \"kubernetes.io/projected/f7b8f2b8-0607-467d-8ba2-3b823817b639-kube-api-access-hjmnd\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: \"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677214 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dba6300-591c-4fc5-8544-b208731d2dc6-cert\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677281 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677333 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-key\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677375 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-socket-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677398 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wnj\" (UniqueName: \"kubernetes.io/projected/2432a454-1fbc-4fe4-a6cb-27292e8b670d-kube-api-access-j2wnj\") pod \"migrator-59844c95c7-mbv68\" (UID: \"2432a454-1fbc-4fe4-a6cb-27292e8b670d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f372b9ff-41d6-4712-bf7a-9c229f1f7673-metrics-tls\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677448 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677477 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2058d7-0c77-4f28-a103-679184ed575c-serving-cert\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677546 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78kq2\" (UniqueName: \"kubernetes.io/projected/c91cf18b-1765-48d3-9e00-66747b628f33-kube-api-access-78kq2\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677578 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-webhook-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7b8f2b8-0607-467d-8ba2-3b823817b639-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: \"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39079cdf-1b40-4f77-ad11-3816fc89e3df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-apiservice-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677720 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-profile-collector-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677741 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c0f392-b6e8-4719-8cad-0d267cf8b955-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677790 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-mountpoint-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677810 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/339ca768-fe61-40dd-8a4f-93363aa23972-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677893 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdx6\" (UniqueName: \"kubernetes.io/projected/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-kube-api-access-gsdx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677919 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.677977 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678036 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlsz\" (UniqueName: 
\"kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678072 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-srv-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678158 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c0f392-b6e8-4719-8cad-0d267cf8b955-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678183 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwg7w\" (UniqueName: \"kubernetes.io/projected/56909779-30d4-4350-810d-9675796d96ad-kube-api-access-hwg7w\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678205 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-certs\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678234 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-npm8w\" (UniqueName: \"kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678257 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-csi-data-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678282 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxf26\" (UniqueName: \"kubernetes.io/projected/43af91cb-669b-473a-a92f-d6b8fffa0cc7-kube-api-access-jxf26\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f372b9ff-41d6-4712-bf7a-9c229f1f7673-config-volume\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678382 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9wjh\" (UniqueName: \"kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:21:59 crc kubenswrapper[4971]: 
I0309 09:21:59.678419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbb9\" (UniqueName: \"kubernetes.io/projected/83f8f490-e050-4721-8784-0879496323ad-kube-api-access-xfbb9\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-plugins-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678503 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqcm\" (UniqueName: \"kubernetes.io/projected/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-kube-api-access-jtqcm\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678526 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-node-bootstrap-token\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678550 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btv9h\" (UniqueName: \"kubernetes.io/projected/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-kube-api-access-btv9h\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678587 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-tmpfs\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678612 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-registration-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678634 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vj4\" (UniqueName: \"kubernetes.io/projected/6dba6300-591c-4fc5-8544-b208731d2dc6-kube-api-access-x7vj4\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678666 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95c0f392-b6e8-4719-8cad-0d267cf8b955-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678688 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-images\") pod 
\"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678723 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b5fx\" (UniqueName: \"kubernetes.io/projected/7c2058d7-0c77-4f28-a103-679184ed575c-kube-api-access-2b5fx\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678775 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-config\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678797 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7rf\" (UniqueName: \"kubernetes.io/projected/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-kube-api-access-4m7rf\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678821 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-srv-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678845 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39079cdf-1b40-4f77-ad11-3816fc89e3df-proxy-tls\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2058d7-0c77-4f28-a103-679184ed575c-config\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678887 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.678948 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fmfp\" (UniqueName: \"kubernetes.io/projected/f372b9ff-41d6-4712-bf7a-9c229f1f7673-kube-api-access-8fmfp\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls" Mar 09 
09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.679323 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.179299291 +0000 UTC m=+123.739227091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.679677 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-tmpfs\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.679760 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c0f392-b6e8-4719-8cad-0d267cf8b955-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.680003 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-registration-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.681172 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-images\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.681650 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-cabundle\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.682171 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.682401 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-csi-data-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.683401 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f372b9ff-41d6-4712-bf7a-9c229f1f7673-config-volume\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.683535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.685138 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.686096 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.686164 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83f8f490-e050-4721-8784-0879496323ad-proxy-tls\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.686685 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-apiservice-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.687741 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.688541 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-socket-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.689686 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-config\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.689966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-plugins-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.690396 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6dba6300-591c-4fc5-8544-b208731d2dc6-cert\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.690899 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.691126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-certs\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.691226 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f372b9ff-41d6-4712-bf7a-9c229f1f7673-metrics-tls\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.691884 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-profile-collector-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.691946 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39079cdf-1b40-4f77-ad11-3816fc89e3df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.692110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2058d7-0c77-4f28-a103-679184ed575c-serving-cert\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.692273 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.692528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/56909779-30d4-4350-810d-9675796d96ad-srv-cert\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.692675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c91cf18b-1765-48d3-9e00-66747b628f33-srv-cert\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.692618 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-signing-key\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.693106 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c2058d7-0c77-4f28-a103-679184ed575c-config\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.693780 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-mountpoint-dir\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.694070 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.694518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-webhook-cert\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.694569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83f8f490-e050-4721-8784-0879496323ad-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.694755 4971 projected.go:288] Couldn't get configMap openshift-controller-manager-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.694872 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.695759 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c0f392-b6e8-4719-8cad-0d267cf8b955-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.696392 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.696518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39079cdf-1b40-4f77-ad11-3816fc89e3df-proxy-tls\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.697125 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43af91cb-669b-473a-a92f-d6b8fffa0cc7-node-bootstrap-token\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.698308 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7b8f2b8-0607-467d-8ba2-3b823817b639-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: \"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.698473 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.698948 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/339ca768-fe61-40dd-8a4f-93363aa23972-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.703914 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.723303 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.743243 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.748349 4971 projected.go:194] Error preparing data for projected volume kube-api-access-dnhl5 for pod openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.748476 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5 podName:23bc42fe-aadb-4283-a679-b07d87b04a15 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.248421019 +0000 UTC m=+123.808348829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dnhl5" (UniqueName: "kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5") pod "machine-approver-56656f9798-9w96x" (UID: "23bc42fe-aadb-4283-a679-b07d87b04a15") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.770609 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.781003 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.782037 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.28201744 +0000 UTC m=+123.841945250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.784009 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.785616 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3475c2cd7919f4f19793ac4e41a8f71a679e4439fdb6aa70a9a4f1942cdd4df7"}
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.787308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" event={"ID":"ca599a0a-36fb-4040-9304-01eb9d4c19d0","Type":"ContainerStarted","Data":"6b4152f684aff152a0c84dd7638e7403fa09e7463d5bb999367b129655395091"}
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.803906 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 09:21:59 crc kubenswrapper[4971]: W0309 09:21:59.809669 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-29313920f02b746bfd0f9c40de4ff611d91d160670475cb918cb3a2778f0f3ff WatchSource:0}: Error finding container 29313920f02b746bfd0f9c40de4ff611d91d160670475cb918cb3a2778f0f3ff: Status 404 returned error can't find the container with id 29313920f02b746bfd0f9c40de4ff611d91d160670475cb918cb3a2778f0f3ff
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.823636 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.837098 4971 projected.go:288] Couldn't get configMap openshift-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.845888 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: W0309 09:21:59.846203 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-22966eed4f8a655459212910318d94860c57d4472fd158cc8c5e9f4d7cb7834b WatchSource:0}: Error finding container 22966eed4f8a655459212910318d94860c57d4472fd158cc8c5e9f4d7cb7834b: Status 404 returned error can't find the container with id 22966eed4f8a655459212910318d94860c57d4472fd158cc8c5e9f4d7cb7834b
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.864071 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.882508 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.883821 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.383800411 +0000 UTC m=+123.943728231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.889922 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.904771 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.923895 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.947352 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.964460 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.965232 4971 projected.go:194] Error preparing data for projected volume kube-api-access-brnkp for pod openshift-console/console-f9d7485db-dnx9z: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.965343 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp podName:c8c3ac1c-4896-4db2-8917-0a57667a1fa8 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.4653188 +0000 UTC m=+124.025246630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-brnkp" (UniqueName: "kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp") pod "console-f9d7485db-dnx9z" (UID: "c8c3ac1c-4896-4db2-8917-0a57667a1fa8") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.968797 4971 projected.go:194] Error preparing data for projected volume kube-api-access-2pvnt for pod openshift-console/downloads-7954f5f757-d25sv: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.968862 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt podName:afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.468848473 +0000 UTC m=+124.028776283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2pvnt" (UniqueName: "kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt") pod "downloads-7954f5f757-d25sv" (UID: "afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.979146 4971 projected.go:288] Couldn't get configMap openshift-console-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.979577 4971 projected.go:194] Error preparing data for projected volume kube-api-access-mkn5d for pod openshift-console-operator/console-operator-58897d9998-wf8hd: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.979664 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d podName:79d6be06-8c45-4058-a2ff-5daf63d0404e nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.479640858 +0000 UTC m=+124.039568738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mkn5d" (UniqueName: "kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d") pod "console-operator-58897d9998-wf8hd" (UID: "79d6be06-8c45-4058-a2ff-5daf63d0404e") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.984887 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:21:59 crc kubenswrapper[4971]: E0309 09:21:59.985292 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.485274063 +0000 UTC m=+124.045201873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:21:59 crc kubenswrapper[4971]: I0309 09:21:59.988796 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.003394 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.007324 4971 projected.go:194] Error preparing data for projected volume kube-api-access-hmzhb for pod openshift-config-operator/openshift-config-operator-7777fb866f-h78mk: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.007452 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb podName:80f7e4a7-4617-4978-b42e-8a33b6465690 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.50742834 +0000 UTC m=+124.067356170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hmzhb" (UniqueName: "kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb") pod "openshift-config-operator-7777fb866f-h78mk" (UID: "80f7e4a7-4617-4978-b42e-8a33b6465690") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.018429 4971 projected.go:288] Couldn't get configMap openshift-apiserver/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.018485 4971 projected.go:194] Error preparing data for projected volume kube-api-access-ftt99 for pod openshift-apiserver/apiserver-76f77b778f-rqlbq: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.018550 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99 podName:bcd6b63d-8557-4c0b-b000-7d9e14cd229e nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.518529604 +0000 UTC m=+124.078457414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ftt99" (UniqueName: "kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99") pod "apiserver-76f77b778f-rqlbq" (UID: "bcd6b63d-8557-4c0b-b000-7d9e14cd229e") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.023330 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.045179 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.061792 4971 request.go:700] Waited for 1.043631524s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27078
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.063681 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.086272 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.086482 4971 projected.go:194] Error preparing data for projected volume kube-api-access-kfsh6 for pod openshift-dns-operator/dns-operator-744455d44c-7xwd6: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.086536 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6 podName:7e1d5ee3-5d9c-4d44-bf5a-343216e8803e nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.586519638 +0000 UTC m=+124.146447448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kfsh6" (UniqueName: "kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6") pod "dns-operator-744455d44c-7xwd6" (UID: "7e1d5ee3-5d9c-4d44-bf5a-343216e8803e") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.086794 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.086931 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.58691935 +0000 UTC m=+124.146847150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.087396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.087644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kd9w\" (UniqueName: \"kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.087703 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.587696543 +0000 UTC m=+124.147624353 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.090991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kd9w\" (UniqueName: \"kubernetes.io/projected/d28dca1b-efeb-4b15-833b-8bc78aa16238-kube-api-access-4kd9w\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.104447 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.123310 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550802-d8cbz"] Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.124231 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.132067 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-d8cbz"] Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.144382 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.148634 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.181068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlzb\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.188715 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.189373 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrhf\" (UniqueName: \"kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf\") pod \"auto-csr-approver-29550802-d8cbz\" (UID: \"603b9f27-06c0-4fe8-8cc3-416122462369\") " pod="openshift-infra/auto-csr-approver-29550802-d8cbz" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.189730 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.689713431 +0000 UTC m=+124.249641241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.226873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph782\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-kube-api-access-ph782\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.243657 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.246879 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.250227 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.271631 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.279471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-audit-policies\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.284735 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.290418 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-config\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.290994 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.291050 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrhf\" (UniqueName: \"kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf\") pod \"auto-csr-approver-29550802-d8cbz\" (UID: \"603b9f27-06c0-4fe8-8cc3-416122462369\") " pod="openshift-infra/auto-csr-approver-29550802-d8cbz" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.291135 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhl5\" (UniqueName: \"kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.291450 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.79142974 +0000 UTC m=+124.351357590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.295964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhl5\" (UniqueName: \"kubernetes.io/projected/23bc42fe-aadb-4283-a679-b07d87b04a15-kube-api-access-dnhl5\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.315729 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk"] Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.323403 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 
09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.325452 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.345218 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.353883 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.363891 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.368778 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.392395 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.392578 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.892550172 +0000 UTC m=+124.452477982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.393048 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.393551 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.89352845 +0000 UTC m=+124.453456280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.409853 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.413967 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.423536 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.433851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b5a937-bb23-4b89-86a3-6ad4944f5440-serving-cert\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.443693 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.450686 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-encryption-config\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.465593 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.469460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.483189 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.489765 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.495247 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.496132 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brnkp\" (UniqueName: \"kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.496322 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn5d\" (UniqueName: \"kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.496517 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.996487266 +0000 UTC m=+124.556415086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.498574 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvnt\" (UniqueName: \"kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt\") pod \"downloads-7954f5f757-d25sv\" (UID: \"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa\") " pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.499031 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.499724 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:00.99970023 +0000 UTC m=+124.559628040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.503246 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkn5d\" (UniqueName: \"kubernetes.io/projected/79d6be06-8c45-4058-a2ff-5daf63d0404e-kube-api-access-mkn5d\") pod \"console-operator-58897d9998-wf8hd\" (UID: \"79d6be06-8c45-4058-a2ff-5daf63d0404e\") " pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.504076 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvnt\" (UniqueName: \"kubernetes.io/projected/afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa-kube-api-access-2pvnt\") pod \"downloads-7954f5f757-d25sv\" (UID: \"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa\") " pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.506930 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brnkp\" (UniqueName: \"kubernetes.io/projected/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-kube-api-access-brnkp\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.511053 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.521068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.546046 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.554065 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db64f07f-f1cb-4754-8e1f-33951a826f78-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.579872 4971 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.579966 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images podName:db64f07f-f1cb-4754-8e1f-33951a826f78 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.079946412 +0000 UTC m=+124.639874222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images") pod "machine-api-operator-5694c8668f-t9hb6" (UID: "db64f07f-f1cb-4754-8e1f-33951a826f78") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.579899 4971 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580319 4971 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580371 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle podName:50b5a937-bb23-4b89-86a3-6ad4944f5440 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080348834 +0000 UTC m=+124.640276644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle") pod "authentication-operator-69f744f599-mct42" (UID: "50b5a937-bb23-4b89-86a3-6ad4944f5440") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580392 4971 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580416 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert podName:70b1c95e-1326-4a4d-92f8-12df76f6a23a nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:01.080409996 +0000 UTC m=+124.640337806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert") pod "route-controller-manager-6576b87f9c-t9sxl" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580435 4971 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580451 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca podName:5073d2d2-177a-4e70-9638-7fe56084c301 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080446517 +0000 UTC m=+124.640374327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca") pod "apiserver-7bbb656c7d-brs7r" (UID: "5073d2d2-177a-4e70-9638-7fe56084c301") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.579873 4971 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580558 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert podName:2555712b-fa0a-4831-90ca-78d22b2e48b9 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.0805522 +0000 UTC m=+124.640480010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-nvzgg" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580271 4971 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580582 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle podName:5073d2d2-177a-4e70-9638-7fe56084c301 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080577291 +0000 UTC m=+124.640505101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle") pod "apiserver-7bbb656c7d-brs7r" (UID: "5073d2d2-177a-4e70-9638-7fe56084c301") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580290 4971 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580605 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle podName:50b5a937-bb23-4b89-86a3-6ad4944f5440 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080601101 +0000 UTC m=+124.640528911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle") pod "authentication-operator-69f744f599-mct42" (UID: "50b5a937-bb23-4b89-86a3-6ad4944f5440") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580620 4971 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580636 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert podName:5073d2d2-177a-4e70-9638-7fe56084c301 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080631872 +0000 UTC m=+124.640559682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert") pod "apiserver-7bbb656c7d-brs7r" (UID: "5073d2d2-177a-4e70-9638-7fe56084c301") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.580992 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client podName:5073d2d2-177a-4e70-9638-7fe56084c301 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.080974332 +0000 UTC m=+124.640902162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client") pod "apiserver-7bbb656c7d-brs7r" (UID: "5073d2d2-177a-4e70-9638-7fe56084c301") : failed to sync secret cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.581794 4971 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.581936 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config podName:50b5a937-bb23-4b89-86a3-6ad4944f5440 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.08192395 +0000 UTC m=+124.641851770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config") pod "authentication-operator-69f744f599-mct42" (UID: "50b5a937-bb23-4b89-86a3-6ad4944f5440") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.582071 4971 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.582250 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca podName:70b1c95e-1326-4a4d-92f8-12df76f6a23a nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.082237859 +0000 UTC m=+124.642165679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca") pod "route-controller-manager-6576b87f9c-t9sxl" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.582929 4971 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.582970 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config podName:70b1c95e-1326-4a4d-92f8-12df76f6a23a nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.08296072 +0000 UTC m=+124.642888530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config") pod "route-controller-manager-6576b87f9c-t9sxl" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.591179 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600105 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600273 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600342 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600407 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600461 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsh6\" (UniqueName: \"kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.600580 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzhb\" (UniqueName: \"kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.600731 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.100694288 +0000 UTC m=+124.660622098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601092 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601152 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601212 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt99\" (UniqueName: \"kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601291 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23bc42fe-aadb-4283-a679-b07d87b04a15-auth-proxy-config\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.601324 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.602055 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-service-ca\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.602086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.603758 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.603879 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-serving-cert\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.604121 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzhb\" (UniqueName: \"kubernetes.io/projected/80f7e4a7-4617-4978-b42e-8a33b6465690-kube-api-access-hmzhb\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.604648 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23bc42fe-aadb-4283-a679-b07d87b04a15-machine-approver-tls\") pod \"machine-approver-56656f9798-9w96x\" (UID: \"23bc42fe-aadb-4283-a679-b07d87b04a15\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.605007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsh6\" (UniqueName: \"kubernetes.io/projected/7e1d5ee3-5d9c-4d44-bf5a-343216e8803e-kube-api-access-kfsh6\") pod \"dns-operator-744455d44c-7xwd6\" (UID: \"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e\") " pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.605089 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-console-oauth-config\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.605747 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c3ac1c-4896-4db2-8917-0a57667a1fa8-trusted-ca-bundle\") pod \"console-f9d7485db-dnx9z\" (UID: \"c8c3ac1c-4896-4db2-8917-0a57667a1fa8\") " pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.606424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt99\" (UniqueName: \"kubernetes.io/projected/bcd6b63d-8557-4c0b-b000-7d9e14cd229e-kube-api-access-ftt99\") pod \"apiserver-76f77b778f-rqlbq\" (UID: \"bcd6b63d-8557-4c0b-b000-7d9e14cd229e\") " pod="openshift-apiserver/apiserver-76f77b778f-rqlbq"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.606720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d28dca1b-efeb-4b15-833b-8bc78aa16238-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hg8qg\" (UID: \"d28dca1b-efeb-4b15-833b-8bc78aa16238\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.610947 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f7e4a7-4617-4978-b42e-8a33b6465690-serving-cert\") pod \"openshift-config-operator-7777fb866f-h78mk\" (UID: \"80f7e4a7-4617-4978-b42e-8a33b6465690\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.624190 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.644333 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.663827 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.683495 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.694854 4971 projected.go:288] Couldn't get configMap openshift-controller-manager-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.695113 4971 projected.go:194] Error preparing data for projected volume kube-api-access-zvz99 for pod openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s: failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.695228 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99 podName:a5371ca7-5f2f-4b51-add8-021a77d93c9c nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.195209687 +0000 UTC m=+124.755137497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zvz99" (UniqueName: "kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99") pod "openshift-controller-manager-operator-756b6f6bc6-txc2s" (UID: "a5371ca7-5f2f-4b51-add8-021a77d93c9c") : failed to sync configmap cache: timed out waiting for the condition
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.702032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.702716 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.202705676 +0000 UTC m=+124.762633486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.703746 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.724095 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.743623 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.746321 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.763606 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.766244 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x"
Mar 09 09:22:00 crc kubenswrapper[4971]: W0309 09:22:00.779636 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23bc42fe_aadb_4283_a679_b07d87b04a15.slice/crio-2064a0b975894aa3f4653031eea194bea67581bea2d1e41732b48a3857c9d980 WatchSource:0}: Error finding container 2064a0b975894aa3f4653031eea194bea67581bea2d1e41732b48a3857c9d980: Status 404 returned error can't find the container with id 2064a0b975894aa3f4653031eea194bea67581bea2d1e41732b48a3857c9d980
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.781264 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.795690 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" event={"ID":"ca599a0a-36fb-4040-9304-01eb9d4c19d0","Type":"ContainerStarted","Data":"15ad1be64259ddb6c5c62788f6c48e84d2042fe39be9fee5dc4d7284c00c2b42"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.798909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" event={"ID":"23bc42fe-aadb-4283-a679-b07d87b04a15","Type":"ContainerStarted","Data":"2064a0b975894aa3f4653031eea194bea67581bea2d1e41732b48a3857c9d980"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.802383 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d25sv"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.802798 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.802891 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.30286966 +0000 UTC m=+124.862797470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.803044 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.803463 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.303453457 +0000 UTC m=+124.863381357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.803818 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.805774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" event={"ID":"d4feaecd-f674-489f-a6d3-12e5f433d90e","Type":"ContainerStarted","Data":"4e935a373af097ed675ceeaa2818b0d1ea4b9457e327a7a38473e799b110487e"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.805810 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" event={"ID":"d4feaecd-f674-489f-a6d3-12e5f433d90e","Type":"ContainerStarted","Data":"a6a85d2b598656c8522e5bd95cbc97b27083670d0d3ce104430cda314445fede"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.808802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b6b53728c5fb4faf7bd4d3801adcc33057595481fa7f5456c48f6f8fdb347e28"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.808836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"29313920f02b746bfd0f9c40de4ff611d91d160670475cb918cb3a2778f0f3ff"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.809031 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.810336 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dddfeefb9cf733cbb2ccbe655c60a8c1a7104c7d608b71448fee0a9f4cb0a395"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.810401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"22966eed4f8a655459212910318d94860c57d4472fd158cc8c5e9f4d7cb7834b"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.813998 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.829587 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2f1942ce7be616456ca328196060792d58d478f65a2af0f49c24b931ae04e2c0"}
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.847937 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fbb93d1-04cc-4152-b593-4fc23ebfa1ac-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdnk\" (UID: \"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.892114 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.906005 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.908786 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.408758301 +0000 UTC m=+124.968686121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.909164 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:00 crc kubenswrapper[4971]: E0309 09:22:00.911509 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.4114934 +0000 UTC m=+124.971421210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.919417 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htn6w\" (UniqueName: \"kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w\") pod \"controller-manager-879f6c89f-lmr9s\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.934329 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg"]
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.944745 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftf96\" (UniqueName: \"kubernetes.io/projected/339ca768-fe61-40dd-8a4f-93363aa23972-kube-api-access-ftf96\") pod \"package-server-manager-789f6589d5-qz4l7\" (UID: \"339ca768-fe61-40dd-8a4f-93363aa23972\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.962755 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx2kb\" (UniqueName: \"kubernetes.io/projected/39079cdf-1b40-4f77-ad11-3816fc89e3df-kube-api-access-vx2kb\") pod \"machine-config-controller-84d6567774-h8b5s\" (UID: \"39079cdf-1b40-4f77-ad11-3816fc89e3df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"
Mar 09 09:22:00 crc kubenswrapper[4971]: I0309 09:22:00.977719 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fmfp\" (UniqueName: \"kubernetes.io/projected/f372b9ff-41d6-4712-bf7a-9c229f1f7673-kube-api-access-8fmfp\") pod \"dns-default-th2ls\" (UID: \"f372b9ff-41d6-4712-bf7a-9c229f1f7673\") " pod="openshift-dns/dns-default-th2ls"
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.000540 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwg7w\" (UniqueName: \"kubernetes.io/projected/56909779-30d4-4350-810d-9675796d96ad-kube-api-access-hwg7w\") pod \"olm-operator-6b444d44fb-5bzw7\" (UID: \"56909779-30d4-4350-810d-9675796d96ad\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.012252 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.012862 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.512848149 +0000 UTC m=+125.072775959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.018252 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h78mk"]
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.024827 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vj4\" (UniqueName: \"kubernetes.io/projected/6dba6300-591c-4fc5-8544-b208731d2dc6-kube-api-access-x7vj4\") pod \"ingress-canary-57xjq\" (UID: \"6dba6300-591c-4fc5-8544-b208731d2dc6\") " pod="openshift-ingress-canary/ingress-canary-57xjq"
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.041187 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95c0f392-b6e8-4719-8cad-0d267cf8b955-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2kmr7\" (UID: \"95c0f392-b6e8-4719-8cad-0d267cf8b955\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"
Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.047420 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-th2ls" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.062965 4971 request.go:700] Waited for 1.380609572s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/collect-profiles/token Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.067531 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9wjh\" (UniqueName: \"kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh\") pod \"marketplace-operator-79b997595-zqnt8\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.083508 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm8w\" (UniqueName: \"kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w\") pod \"collect-profiles-29550795-nmbwp\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.097749 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dnx9z"] Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.104645 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.105651 4971 projected.go:288] Couldn't get configMap openshift-etcd-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.105669 4971 projected.go:194] Error preparing data for projected volume kube-api-access-jjgtf for pod openshift-etcd-operator/etcd-operator-b45778765-nw59v: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.105714 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf podName:2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.6056984 +0000 UTC m=+125.165626210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jjgtf" (UniqueName: "kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf") pod "etcd-operator-b45778765-nw59v" (UID: "2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.108771 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxf26\" (UniqueName: \"kubernetes.io/projected/43af91cb-669b-473a-a92f-d6b8fffa0cc7-kube-api-access-jxf26\") pod \"machine-config-server-4g4cr\" (UID: \"43af91cb-669b-473a-a92f-d6b8fffa0cc7\") " pod="openshift-machine-config-operator/machine-config-server-4g4cr" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.113984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114052 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114159 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114202 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114223 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114257 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114287 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114312 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114334 
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114353 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.114387 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.115428 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db64f07f-f1cb-4754-8e1f-33951a826f78-images\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.116205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.116741 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.116937 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.616919447 +0000 UTC m=+125.176847327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.116975 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-config\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.117341 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") pod 
\"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.121009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.121233 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50b5a937-bb23-4b89-86a3-6ad4944f5440-service-ca-bundle\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.122728 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.123103 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.125709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hjmnd\" (UniqueName: \"kubernetes.io/projected/f7b8f2b8-0607-467d-8ba2-3b823817b639-kube-api-access-hjmnd\") pod \"multus-admission-controller-857f4d67dd-48g6z\" (UID: \"f7b8f2b8-0607-467d-8ba2-3b823817b639\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.126063 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-serving-cert\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.130156 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-client\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.131117 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d25sv"] Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.131680 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5073d2d2-177a-4e70-9638-7fe56084c301-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.138302 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b5fx\" (UniqueName: \"kubernetes.io/projected/7c2058d7-0c77-4f28-a103-679184ed575c-kube-api-access-2b5fx\") pod \"service-ca-operator-777779d784-brkss\" (UID: \"7c2058d7-0c77-4f28-a103-679184ed575c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.147574 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.158820 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.166604 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7rf\" (UniqueName: \"kubernetes.io/projected/13a19b2e-fdd8-41cc-89ac-ed182fa3a449-kube-api-access-4m7rf\") pod \"packageserver-d55dfcdfc-b65bx\" (UID: \"13a19b2e-fdd8-41cc-89ac-ed182fa3a449\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.168315 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.174700 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.198538 4971 projected.go:288] Couldn't get configMap openshift-apiserver-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.203321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbb9\" (UniqueName: \"kubernetes.io/projected/83f8f490-e050-4721-8784-0879496323ad-kube-api-access-xfbb9\") pod \"machine-config-operator-74547568cd-4zczm\" (UID: \"83f8f490-e050-4721-8784-0879496323ad\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.203460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wnj\" (UniqueName: \"kubernetes.io/projected/2432a454-1fbc-4fe4-a6cb-27292e8b670d-kube-api-access-j2wnj\") pod \"migrator-59844c95c7-mbv68\" (UID: \"2432a454-1fbc-4fe4-a6cb-27292e8b670d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.217070 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.217316 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.717293097 +0000 UTC m=+125.277220907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.217496 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvz99\" (UniqueName: \"kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.220988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqcm\" (UniqueName: \"kubernetes.io/projected/91b993d3-35bb-4b9b-9e1a-ca96fa6f8162-kube-api-access-jtqcm\") pod \"csi-hostpathplugin-krqgm\" (UID: \"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162\") " pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.227854 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.237868 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btv9h\" (UniqueName: \"kubernetes.io/projected/e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a-kube-api-access-btv9h\") pod \"service-ca-9c57cc56f-fg4hj\" (UID: \"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a\") " pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.238085 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.250824 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.262038 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlsz\" (UniqueName: \"kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz\") pod \"cni-sysctl-allowlist-ds-n8lbv\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.262438 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.269491 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.283174 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th2ls"] Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.283197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78kq2\" (UniqueName: \"kubernetes.io/projected/c91cf18b-1765-48d3-9e00-66747b628f33-kube-api-access-78kq2\") pod \"catalog-operator-68c6474976-j9sc4\" (UID: \"c91cf18b-1765-48d3-9e00-66747b628f33\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.289162 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.291007 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.299754 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.309138 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.320939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.320974 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57xjq" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.322830 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.822814788 +0000 UTC m=+125.382742588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.323224 4971 projected.go:288] Couldn't get configMap openshift-authentication-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.324587 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.327448 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8a3dd14-c5c3-4251-88bf-31dcafe04ef1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xhbrd\" (UID: \"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.330729 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.338714 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4g4cr" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.339248 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvz99\" (UniqueName: \"kubernetes.io/projected/a5371ca7-5f2f-4b51-add8-021a77d93c9c-kube-api-access-zvz99\") pod \"openshift-controller-manager-operator-756b6f6bc6-txc2s\" (UID: \"a5371ca7-5f2f-4b51-add8-021a77d93c9c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.344719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.363999 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.383639 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.398495 4971 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.424332 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.424497 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:01.924468725 +0000 UTC m=+125.484396545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.424625 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.424984 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:01.92497258 +0000 UTC m=+125.484900390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.427849 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.434785 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.462854 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrhf\" (UniqueName: \"kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf\") pod \"auto-csr-approver-29550802-d8cbz\" (UID: \"603b9f27-06c0-4fe8-8cc3-416122462369\") " pod="openshift-infra/auto-csr-approver-29550802-d8cbz" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.463880 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.484977 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.485147 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.493819 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.507774 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.524574 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.525177 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.525741 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.025723541 +0000 UTC m=+125.585651351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.525813 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.551740 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.564001 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.564212 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.596320 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.603450 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.606567 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtn24\" (UniqueName: \"kubernetes.io/projected/9e0270a9-8b08-4abf-88da-75319c5e6f48-kube-api-access-vtn24\") pod \"router-default-5444994796-w4c8h\" (UID: \"9e0270a9-8b08-4abf-88da-75319c5e6f48\") " pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.627081 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjgtf\" (UniqueName: \"kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.627178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.628459 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.128444379 +0000 UTC m=+125.688372189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.631262 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjgtf\" (UniqueName: \"kubernetes.io/projected/2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da-kube-api-access-jjgtf\") pod \"etcd-operator-b45778765-nw59v\" (UID: \"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.631645 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.637864 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.645255 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.664964 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.669765 4971 projected.go:194] Error preparing data for projected volume kube-api-access-92vq4 for pod openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.669856 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4 podName:f037869a-34a5-43d9-8c27-6ac17e4fe6b1 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.169833288 +0000 UTC m=+125.729761098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-92vq4" (UniqueName: "kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4") pod "openshift-apiserver-operator-796bbdcf4f-pf6b7" (UID: "f037869a-34a5-43d9-8c27-6ac17e4fe6b1") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.692997 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.693377 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.693413 4971 projected.go:194] Error preparing data for projected volume kube-api-access-p5mrq for pod openshift-authentication-operator/authentication-operator-69f744f599-mct42: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.693460 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq podName:50b5a937-bb23-4b89-86a3-6ad4944f5440 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.193443777 +0000 UTC m=+125.753371587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-p5mrq" (UniqueName: "kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq") pod "authentication-operator-69f744f599-mct42" (UID: "50b5a937-bb23-4b89-86a3-6ad4944f5440") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.695413 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.706643 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.708697 4971 projected.go:194] Error preparing data for projected volume kube-api-access-5g9hh for pod openshift-machine-api/machine-api-operator-5694c8668f-t9hb6: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.708809 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh podName:db64f07f-f1cb-4754-8e1f-33951a826f78 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.208789185 +0000 UTC m=+125.768716995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5g9hh" (UniqueName: "kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh") pod "machine-api-operator-5694c8668f-t9hb6" (UID: "db64f07f-f1cb-4754-8e1f-33951a826f78") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.718476 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdx6\" (UniqueName: \"kubernetes.io/projected/c9cdbff0-0cca-4375-8c92-1117ce5d1dea-kube-api-access-gsdx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-dvp8t\" (UID: \"c9cdbff0-0cca-4375-8c92-1117ce5d1dea\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.721381 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.727796 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.728099 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.228076718 +0000 UTC m=+125.788004528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.728277 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.728565 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.228552882 +0000 UTC m=+125.788480692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.744071 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.748704 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9xf\" (UniqueName: \"kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf\") pod \"route-controller-manager-6576b87f9c-t9sxl\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.803715 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.807214 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.807277 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 09:22:01 crc 
kubenswrapper[4971]: I0309 09:22:01.818454 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.823896 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rps\" (UniqueName: \"kubernetes.io/projected/f9f38918-24d8-44d6-9ed5-0d9e69ddc590-kube-api-access-l8rps\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwp59\" (UID: \"f9f38918-24d8-44d6-9ed5-0d9e69ddc590\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.824354 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.828909 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.829310 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.329291502 +0000 UTC m=+125.889219312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.833406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7tf\" (UniqueName: \"kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf\") pod \"oauth-openshift-558db77b4-nvzgg\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.841350 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzdw\" (UniqueName: \"kubernetes.io/projected/5073d2d2-177a-4e70-9638-7fe56084c301-kube-api-access-rpzdw\") pod \"apiserver-7bbb656c7d-brs7r\" (UID: \"5073d2d2-177a-4e70-9638-7fe56084c301\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.844626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dnx9z" event={"ID":"c8c3ac1c-4896-4db2-8917-0a57667a1fa8","Type":"ContainerStarted","Data":"000b3705ed5fcb0464783395439463f6717916c845721203750e2b06de55366b"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.844664 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dnx9z" event={"ID":"c8c3ac1c-4896-4db2-8917-0a57667a1fa8","Type":"ContainerStarted","Data":"9d00534b9e0623a02c348f0154d77a68d884d647b6d5776fab375244717fa46a"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.847910 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4g4cr" event={"ID":"43af91cb-669b-473a-a92f-d6b8fffa0cc7","Type":"ContainerStarted","Data":"95b2275da7c62d154a3c41bd11103e8131a1869c9339b738d0ffafd98595107f"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.849429 4971 generic.go:334] "Generic (PLEG): container finished" podID="80f7e4a7-4617-4978-b42e-8a33b6465690" containerID="b716f2eb29255e505fc6cf32493a9d37bb8761ca0c51c19a9f092a140f17ff85" exitCode=0 Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.849479 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" event={"ID":"80f7e4a7-4617-4978-b42e-8a33b6465690","Type":"ContainerDied","Data":"b716f2eb29255e505fc6cf32493a9d37bb8761ca0c51c19a9f092a140f17ff85"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.849495 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" event={"ID":"80f7e4a7-4617-4978-b42e-8a33b6465690","Type":"ContainerStarted","Data":"14b6b19188d2daa8c53f8057eb53776b5f78144d8cd66be35b548625d023005b"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.851315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" event={"ID":"d28dca1b-efeb-4b15-833b-8bc78aa16238","Type":"ContainerStarted","Data":"33272a89716180f968143aca76e52e01fa851cd20e58b1103a9b56e172fb45eb"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.851343 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" event={"ID":"d28dca1b-efeb-4b15-833b-8bc78aa16238","Type":"ContainerStarted","Data":"d207faa6c4f2ae9dcc41cbc24e452e4929e94fb955c9bb7b976dc79f73928d28"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.851382 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" event={"ID":"d28dca1b-efeb-4b15-833b-8bc78aa16238","Type":"ContainerStarted","Data":"65c7a56c5f44a645eb88c6d43b2cec699f926adbe4705e999356661174eab73e"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.854989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d25sv" event={"ID":"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa","Type":"ContainerStarted","Data":"67ecf997d9524ce207cd72be3a5a0ab6716d663d11fea3fdcd0bb3a9765f13d7"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.855029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d25sv" event={"ID":"afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa","Type":"ContainerStarted","Data":"3d933b8766d5863609d99ca65c46b7161014b709c7df8c28339a9ef10d6cbabb"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.855434 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.857432 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" event={"ID":"23bc42fe-aadb-4283-a679-b07d87b04a15","Type":"ContainerStarted","Data":"16533c1156de354f0cf17d3d4b8f2fbeaf14b8f7363de8be6c1fc5e91ebe8c94"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.857467 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" event={"ID":"23bc42fe-aadb-4283-a679-b07d87b04a15","Type":"ContainerStarted","Data":"1550e2b27844d33402d4a63dd95094a3230f59a014e7d0fd8b3bad28de153a1b"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.859001 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.859035 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.867757 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" event={"ID":"1a0999c2-4d90-4197-8075-e11790a0ed9b","Type":"ContainerStarted","Data":"0df6c92ebf52c79c1452578b5abd86c7588faa9829a1cca2d963045587eed64a"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.880488 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th2ls" event={"ID":"f372b9ff-41d6-4712-bf7a-9c229f1f7673","Type":"ContainerStarted","Data":"a5442709533f1b90a95cb98046e181578a18792101f4a0ec1d055662108e7587"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.900217 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w4c8h" event={"ID":"9e0270a9-8b08-4abf-88da-75319c5e6f48","Type":"ContainerStarted","Data":"2a79e784ffd87a5da9fa55c715a4d0d7f1d988f363b034aed82cd35e57b4583b"} Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.930796 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.931487 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: 
\"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:01 crc kubenswrapper[4971]: E0309 09:22:01.931802 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.431788095 +0000 UTC m=+125.991715905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.944598 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.945800 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.948841 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.983364 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 09:22:01 crc kubenswrapper[4971]: I0309 09:22:01.983520 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.024973 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.032745 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.035670 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.535649036 +0000 UTC m=+126.095576846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.036728 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.046297 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.052152 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.140956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.141302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.141556 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.641541517 +0000 UTC m=+126.201469327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.242429 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.242699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vq4\" (UniqueName: \"kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.242746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5mrq\" (UniqueName: \"kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.242825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9hh\" (UniqueName: 
\"kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.243260 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.743209905 +0000 UTC m=+126.303137715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.250467 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b19b44a-0898-4886-b5d2-4bc4ff950094-metrics-certs\") pod \"network-metrics-daemon-9lhtb\" (UID: \"8b19b44a-0898-4886-b5d2-4bc4ff950094\") " pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.250632 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vq4\" (UniqueName: \"kubernetes.io/projected/f037869a-34a5-43d9-8c27-6ac17e4fe6b1-kube-api-access-92vq4\") pod \"openshift-apiserver-operator-796bbdcf4f-pf6b7\" (UID: \"f037869a-34a5-43d9-8c27-6ac17e4fe6b1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.250751 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9hh\" (UniqueName: \"kubernetes.io/projected/db64f07f-f1cb-4754-8e1f-33951a826f78-kube-api-access-5g9hh\") pod \"machine-api-operator-5694c8668f-t9hb6\" (UID: \"db64f07f-f1cb-4754-8e1f-33951a826f78\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.250792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5mrq\" (UniqueName: \"kubernetes.io/projected/50b5a937-bb23-4b89-86a3-6ad4944f5440-kube-api-access-p5mrq\") pod \"authentication-operator-69f744f599-mct42\" (UID: \"50b5a937-bb23-4b89-86a3-6ad4944f5440\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.264186 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.269536 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk"] Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.269765 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.272484 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.274171 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"] Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.277112 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"] Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.279085 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7"] Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.344388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.345913 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.845901173 +0000 UTC m=+126.405828983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.367210 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9lhtb" Mar 09 09:22:02 crc kubenswrapper[4971]: W0309 09:22:02.442459 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c0f392_b6e8_4719_8cad_0d267cf8b955.slice/crio-b66321aa2cfceab60e3dae9160cc7124f9b0c510b6bb7d99df419a95536d29d4 WatchSource:0}: Error finding container b66321aa2cfceab60e3dae9160cc7124f9b0c510b6bb7d99df419a95536d29d4: Status 404 returned error can't find the container with id b66321aa2cfceab60e3dae9160cc7124f9b0c510b6bb7d99df419a95536d29d4 Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.447253 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.447451 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.947424057 +0000 UTC m=+126.507351867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.447599 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.448420 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:02.948407745 +0000 UTC m=+126.508335555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.546733 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.548477 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.548895 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.048878848 +0000 UTC m=+126.608806658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.558456 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.591607 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.600297 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.650469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.650720 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.150709841 +0000 UTC m=+126.710637651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.751844 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.752140 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.252118551 +0000 UTC m=+126.812046361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.752271 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.752697 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.252685668 +0000 UTC m=+126.812613478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.852896 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.853302 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.353283764 +0000 UTC m=+126.913211574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.894447 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cm5pk" podStartSLOduration=62.894423985 podStartE2EDuration="1m2.894423985s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:02.881654133 +0000 UTC m=+126.441581953" watchObservedRunningTime="2026-03-09 09:22:02.894423985 +0000 UTC m=+126.454351805" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.936899 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerStarted","Data":"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.936952 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerStarted","Data":"2af6caf033d893c4e6413834973ee705fc8147c41f3512d296b54c5573bd67f3"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.943604 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" 
event={"ID":"95c0f392-b6e8-4719-8cad-0d267cf8b955","Type":"ContainerStarted","Data":"b66321aa2cfceab60e3dae9160cc7124f9b0c510b6bb7d99df419a95536d29d4"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.948277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" event={"ID":"80f7e4a7-4617-4978-b42e-8a33b6465690","Type":"ContainerStarted","Data":"7eb0690526dcd763c23b4c8ed3d71292110faaaf7e1b2c9678dede06fbcc8648"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.948498 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.949124 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" event={"ID":"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac","Type":"ContainerStarted","Data":"9df2e8a325574d8500ebc6e17cbd9ffa8ace3258f3e132e72c61987116eeba41"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.950495 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" event={"ID":"1a0999c2-4d90-4197-8075-e11790a0ed9b","Type":"ContainerStarted","Data":"61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.951233 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.954183 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:02 crc kubenswrapper[4971]: E0309 09:22:02.954554 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.45453884 +0000 UTC m=+127.014466650 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.957943 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th2ls" event={"ID":"f372b9ff-41d6-4712-bf7a-9c229f1f7673","Type":"ContainerStarted","Data":"ca1c43602ee56282e884359ce1908b1bd3d3614f5bdb7ecd60b03816183cacc2"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.958044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th2ls" event={"ID":"f372b9ff-41d6-4712-bf7a-9c229f1f7673","Type":"ContainerStarted","Data":"43d6f91d54af8d0b7e61a062dcccb9145b027a13dc0bd5375e5a8a5793d02352"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.958067 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-th2ls" Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.959842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" 
event={"ID":"0325b4dc-fe2a-4685-8e37-621a96f6b976","Type":"ContainerStarted","Data":"2c7ea772321c8be3f72680f9f9d2af70bb7eebc7251a552b40a308fe34cc0adb"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.965174 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" event={"ID":"1707bff4-eb31-4ed0-bbc5-054813b1a34a","Type":"ContainerStarted","Data":"47815d72534e469733e7130be9a8d78589dcbf3ab1208bc7459ae3ec23a27e59"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.965222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" event={"ID":"1707bff4-eb31-4ed0-bbc5-054813b1a34a","Type":"ContainerStarted","Data":"5af6fc53c5463f604b20674cee7524f118f21819a0e8eb5f4d3e5d345bd8be5b"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.966924 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4g4cr" event={"ID":"43af91cb-669b-473a-a92f-d6b8fffa0cc7","Type":"ContainerStarted","Data":"3c1f7729c93952458b980093f5f8952e0a83f21b1635d11ab4062ad8bfe35e26"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.970543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w4c8h" event={"ID":"9e0270a9-8b08-4abf-88da-75319c5e6f48","Type":"ContainerStarted","Data":"f251ba5978efd569a0327ef111e2e437efc5a50d8ce267771ea8be10ff4244a3"} Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.971779 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 09 09:22:02 crc kubenswrapper[4971]: I0309 09:22:02.971822 4971 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.055088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.056510 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.555305161 +0000 UTC m=+127.115232971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.057517 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.058671 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.558658939 +0000 UTC m=+127.118586809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.161008 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.161810 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.661791149 +0000 UTC m=+127.221718959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.183896 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68"] Mar 09 09:22:03 crc kubenswrapper[4971]: W0309 09:22:03.198619 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2432a454_1fbc_4fe4_a6cb_27292e8b670d.slice/crio-1f9236579b1563203263b5497f6a134efec3da346b1cb803fa8783be6e79c262 WatchSource:0}: Error finding container 1f9236579b1563203263b5497f6a134efec3da346b1cb803fa8783be6e79c262: Status 404 returned error can't find the container with id 1f9236579b1563203263b5497f6a134efec3da346b1cb803fa8783be6e79c262 Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.214962 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-brkss"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.238470 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.262634 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.263030 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.763016854 +0000 UTC m=+127.322944664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.267047 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.275104 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-krqgm"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.277743 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fg4hj"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.278969 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dnx9z" podStartSLOduration=64.278946989 podStartE2EDuration="1m4.278946989s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:03.278483226 +0000 UTC m=+126.838411036" watchObservedRunningTime="2026-03-09 09:22:03.278946989 +0000 UTC m=+126.838874799" 
Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.289487 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7"] Mar 09 09:22:03 crc kubenswrapper[4971]: W0309 09:22:03.294482 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6ecdd40_f0c8_4f9d_9b8a_90c9941d159a.slice/crio-990ac1016b4b774d34e905055902bf925819e961589b5c2a96a20444efc70172 WatchSource:0}: Error finding container 990ac1016b4b774d34e905055902bf925819e961589b5c2a96a20444efc70172: Status 404 returned error can't find the container with id 990ac1016b4b774d34e905055902bf925819e961589b5c2a96a20444efc70172 Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.315685 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9w96x" podStartSLOduration=64.315659731 podStartE2EDuration="1m4.315659731s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:03.311333154 +0000 UTC m=+126.871260954" watchObservedRunningTime="2026-03-09 09:22:03.315659731 +0000 UTC m=+126.875587561" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.372848 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.373112 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.873081757 +0000 UTC m=+127.433009567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.374333 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.374744 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.874732255 +0000 UTC m=+127.434660065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.400988 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d25sv" podStartSLOduration=64.400969071 podStartE2EDuration="1m4.400969071s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:03.363716904 +0000 UTC m=+126.923644714" watchObservedRunningTime="2026-03-09 09:22:03.400969071 +0000 UTC m=+126.960896891" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.477054 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.477383 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:03.977368281 +0000 UTC m=+127.537296091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.485854 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bzmmh" podStartSLOduration=63.485839169 podStartE2EDuration="1m3.485839169s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:03.485095847 +0000 UTC m=+127.045023657" watchObservedRunningTime="2026-03-09 09:22:03.485839169 +0000 UTC m=+127.045766979" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.581126 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.581712 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.081698987 +0000 UTC m=+127.641626797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.633126 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7"] Mar 09 09:22:03 crc kubenswrapper[4971]: W0309 09:22:03.654700 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56909779_30d4_4350_810d_9675796d96ad.slice/crio-6105907c34136208874364a757a652b71cc07a4138778e2c8b38f7de8e197b7c WatchSource:0}: Error finding container 6105907c34136208874364a757a652b71cc07a4138778e2c8b38f7de8e197b7c: Status 404 returned error can't find the container with id 6105907c34136208874364a757a652b71cc07a4138778e2c8b38f7de8e197b7c Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.662685 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.672335 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-48g6z"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.684279 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.684696 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.184676253 +0000 UTC m=+127.744604063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.693584 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.701314 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:03 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:03 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:03 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.701386 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.745201 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29550802-d8cbz"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.759468 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.763431 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.764995 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.766748 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wf8hd"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.775431 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.783455 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7xwd6"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.785974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.786399 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:04.286379522 +0000 UTC m=+127.846307332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.794691 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57xjq"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.812211 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.816536 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9lhtb"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.831384 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mct42"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.833309 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nvzgg"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.837691 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t9hb6"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.841030 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.842890 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.844530 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rqlbq"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.846084 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nw59v"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.847535 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7"] Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.887779 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.888081 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.38805597 +0000 UTC m=+127.947983780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:03 crc kubenswrapper[4971]: W0309 09:22:03.945534 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603b9f27_06c0_4fe8_8cc3_416122462369.slice/crio-d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9 WatchSource:0}: Error finding container d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9: Status 404 returned error can't find the container with id d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9 Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.949419 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.963069 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hg8qg" podStartSLOduration=64.963048939 podStartE2EDuration="1m4.963048939s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:03.960876846 +0000 UTC m=+127.520804666" watchObservedRunningTime="2026-03-09 09:22:03.963048939 +0000 UTC m=+127.522976749" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 09:22:03.987114 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45362: no serving certificate available for the kubelet" Mar 09 09:22:03 crc kubenswrapper[4971]: I0309 
09:22:03.998323 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:03 crc kubenswrapper[4971]: E0309 09:22:03.998707 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.49869415 +0000 UTC m=+128.058621960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.002183 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w4c8h" podStartSLOduration=64.002167841 podStartE2EDuration="1m4.002167841s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.000382119 +0000 UTC m=+127.560309949" watchObservedRunningTime="2026-03-09 09:22:04.002167841 +0000 UTC m=+127.562095651" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.012660 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" event={"ID":"83f8f490-e050-4721-8784-0879496323ad","Type":"ContainerStarted","Data":"bde2e6804a296771a3b664b5ff76be8069ed9f18a0f26ac0cfa42940381f3ccc"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.012755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" event={"ID":"83f8f490-e050-4721-8784-0879496323ad","Type":"ContainerStarted","Data":"bf4d0547786f67a41727a24db23f81e4af0a589ba22e8fd44906874aa0e54532"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.012765 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" event={"ID":"83f8f490-e050-4721-8784-0879496323ad","Type":"ContainerStarted","Data":"d143909d1dab974fd6d9ac3620e26c826b8de67f7df67caa8cad2b519326a76f"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.043182 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podStartSLOduration=8.043162078 podStartE2EDuration="8.043162078s" podCreationTimestamp="2026-03-09 09:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.038522282 +0000 UTC m=+127.598450092" watchObservedRunningTime="2026-03-09 09:22:04.043162078 +0000 UTC m=+127.603089888" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.045128 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" event={"ID":"339ca768-fe61-40dd-8a4f-93363aa23972","Type":"ContainerStarted","Data":"ea848ed3a7f0f69ef153f3fe41278ffae489adc8cef394e6e204e725f474dabd"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.045179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" event={"ID":"339ca768-fe61-40dd-8a4f-93363aa23972","Type":"ContainerStarted","Data":"e98941577912b011a6600b24724db4da182b252db937abd4793ba9eb8855dbf7"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.050216 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" event={"ID":"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a","Type":"ContainerStarted","Data":"3afe6a16eb602a6319f6e72531ea42d97779473082d445b608e55e742080d883"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.050258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" event={"ID":"e6ecdd40-f0c8-4f9d-9b8a-90c9941d159a","Type":"ContainerStarted","Data":"990ac1016b4b774d34e905055902bf925819e961589b5c2a96a20444efc70172"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.052608 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" event={"ID":"c91cf18b-1765-48d3-9e00-66747b628f33","Type":"ContainerStarted","Data":"922748f28e66424dbe2a28b84eb3f0f58a31f73217261f7184de02a9681bc03c"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.067277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" event={"ID":"c9cdbff0-0cca-4375-8c92-1117ce5d1dea","Type":"ContainerStarted","Data":"0dc7881fb6747f8d67b472e9095848187f3d650fba6074017fffeeeff5c67298"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.089119 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" event={"ID":"603b9f27-06c0-4fe8-8cc3-416122462369","Type":"ContainerStarted","Data":"d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.101403 4971 ???:1] "http: TLS 
handshake error from 192.168.126.11:45368: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.101932 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.102282 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.602264883 +0000 UTC m=+128.162192693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.122678 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4g4cr" podStartSLOduration=8.122656488 podStartE2EDuration="8.122656488s" podCreationTimestamp="2026-03-09 09:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.120482125 +0000 UTC m=+127.680409935" watchObservedRunningTime="2026-03-09 09:22:04.122656488 +0000 UTC m=+127.682584298" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.124153 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-th2ls" podStartSLOduration=7.124142241 podStartE2EDuration="7.124142241s" podCreationTimestamp="2026-03-09 09:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.083440863 +0000 UTC m=+127.643368673" watchObservedRunningTime="2026-03-09 09:22:04.124142241 +0000 UTC m=+127.684070051" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.148790 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" event={"ID":"13a19b2e-fdd8-41cc-89ac-ed182fa3a449","Type":"ContainerStarted","Data":"da1a2209a82408a60673d86c46cfbb039b4f16fc5fc2973af5ea97541f937409"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.171334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" event={"ID":"f9f38918-24d8-44d6-9ed5-0d9e69ddc590","Type":"ContainerStarted","Data":"61740cf159cf469cfe6f372efee1b7e01e2ba96949062b82fc5dac280401e074"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.182756 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" event={"ID":"2432a454-1fbc-4fe4-a6cb-27292e8b670d","Type":"ContainerStarted","Data":"a22e03d1e26951656a8ccba81e96be7dec70a48dd138ba1aecd977ba3bd17e9f"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.182799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" event={"ID":"2432a454-1fbc-4fe4-a6cb-27292e8b670d","Type":"ContainerStarted","Data":"4f5a81b15561a5732bbcc3762523abdf375b15b8eca8a7e12a75bd173ae493c1"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.182808 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" event={"ID":"2432a454-1fbc-4fe4-a6cb-27292e8b670d","Type":"ContainerStarted","Data":"1f9236579b1563203263b5497f6a134efec3da346b1cb803fa8783be6e79c262"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.186000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" event={"ID":"56909779-30d4-4350-810d-9675796d96ad","Type":"ContainerStarted","Data":"6105907c34136208874364a757a652b71cc07a4138778e2c8b38f7de8e197b7c"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.188979 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" event={"ID":"f7b8f2b8-0607-467d-8ba2-3b823817b639","Type":"ContainerStarted","Data":"ddf30fee01925cf6b9b17ac147790e17cbef47b43fe02de7cf812f531ac11aba"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.193935 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45370: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.196143 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" event={"ID":"79d6be06-8c45-4058-a2ff-5daf63d0404e","Type":"ContainerStarted","Data":"626370bc6a68b3f6628b568c7090951fd2b127d65ad22d314300346497eea423"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.197503 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" event={"ID":"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1","Type":"ContainerStarted","Data":"abf028af8a13cc05a1693263b31d27cbdcd1b4dfeeeb350545c4a1f0b0ad64dd"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.198431 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" 
event={"ID":"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162","Type":"ContainerStarted","Data":"2cc4029e308ae67e6793b576d202952eb760c1fc0e3c8c6278279489560af539"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.202428 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" podStartSLOduration=65.202411386 podStartE2EDuration="1m5.202411386s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.200524661 +0000 UTC m=+127.760452491" watchObservedRunningTime="2026-03-09 09:22:04.202411386 +0000 UTC m=+127.762339206" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.202796 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" podStartSLOduration=65.202788767 podStartE2EDuration="1m5.202788767s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.157002381 +0000 UTC m=+127.716930251" watchObservedRunningTime="2026-03-09 09:22:04.202788767 +0000 UTC m=+127.762716577" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.203090 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.203713 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.703699104 +0000 UTC m=+128.263626914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.206210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" event={"ID":"39079cdf-1b40-4f77-ad11-3816fc89e3df","Type":"ContainerStarted","Data":"18bad416dcce51bb7c94c07ac366c327a3bf0ce877653d8c97361441db74788a"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.206252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" event={"ID":"39079cdf-1b40-4f77-ad11-3816fc89e3df","Type":"ContainerStarted","Data":"0b6fa1c1eb513e023ea93edfcd71db7536eb734d9768b24525ea36a34653cd55"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.215301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" event={"ID":"7c2058d7-0c77-4f28-a103-679184ed575c","Type":"ContainerStarted","Data":"93545cd26e4c87231f82bba8692edf9c46c81d66fbe08abcd93f4cc4ee0c8c30"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.215425 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" 
event={"ID":"7c2058d7-0c77-4f28-a103-679184ed575c","Type":"ContainerStarted","Data":"65a6fcb24716661f7d9ed2eb5081dd65c1f374a85c32216433e9549223bc315d"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.219099 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" event={"ID":"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac","Type":"ContainerStarted","Data":"654fff87261378a080c03652a7ac6d566b87ae56b366cf656977bd834767f9ed"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.219141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" event={"ID":"2fbb93d1-04cc-4152-b593-4fc23ebfa1ac","Type":"ContainerStarted","Data":"3e354636f60026f440fc5c364419dbf077dc74eeb184990450f2946a31e81fde"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.223402 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" event={"ID":"95c0f392-b6e8-4719-8cad-0d267cf8b955","Type":"ContainerStarted","Data":"99f5302e19bf58a4252ecfcbcb2cd4240585c06f4234f5cc700863019a08c29d"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.227907 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" event={"ID":"0325b4dc-fe2a-4685-8e37-621a96f6b976","Type":"ContainerStarted","Data":"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560"} Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.227985 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.230772 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.243755 4971 
patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lmr9s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.243820 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.244621 4971 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zqnt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.244681 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.269337 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.274784 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mbv68" podStartSLOduration=64.274768509 podStartE2EDuration="1m4.274768509s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.235211254 +0000 UTC m=+127.795139064" watchObservedRunningTime="2026-03-09 09:22:04.274768509 +0000 UTC m=+127.834696319" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.275097 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-fg4hj" podStartSLOduration=64.275092408 podStartE2EDuration="1m4.275092408s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.274570863 +0000 UTC m=+127.834498673" watchObservedRunningTime="2026-03-09 09:22:04.275092408 +0000 UTC m=+127.835020218" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.295009 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45378: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.304250 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.304454 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.804428544 +0000 UTC m=+128.364356354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.305636 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.308320 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.808298377 +0000 UTC m=+128.368226267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.318440 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2kmr7" podStartSLOduration=64.318238008 podStartE2EDuration="1m4.318238008s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.314182619 +0000 UTC m=+127.874110429" watchObservedRunningTime="2026-03-09 09:22:04.318238008 +0000 UTC m=+127.878165818" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.358948 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-brkss" podStartSLOduration=64.358931515 podStartE2EDuration="1m4.358931515s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.357668389 +0000 UTC m=+127.917596199" watchObservedRunningTime="2026-03-09 09:22:04.358931515 +0000 UTC m=+127.918859325" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.407505 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.407779 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.907760641 +0000 UTC m=+128.467688451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.416752 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45384: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.437393 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" podStartSLOduration=64.437377425 podStartE2EDuration="1m4.437377425s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.437068776 +0000 UTC m=+127.996996586" watchObservedRunningTime="2026-03-09 09:22:04.437377425 +0000 UTC m=+127.997305235" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.478102 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" podStartSLOduration=64.478082594 podStartE2EDuration="1m4.478082594s" 
podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.477121296 +0000 UTC m=+128.037049106" watchObservedRunningTime="2026-03-09 09:22:04.478082594 +0000 UTC m=+128.038010394" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.508676 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.509102 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.009086549 +0000 UTC m=+128.569014359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.523031 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45386: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.551215 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdnk" podStartSLOduration=64.551199208 podStartE2EDuration="1m4.551199208s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:04.516470504 +0000 UTC m=+128.076398324" watchObservedRunningTime="2026-03-09 09:22:04.551199208 +0000 UTC m=+128.111127018" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.610326 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.610503 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:05.110477808 +0000 UTC m=+128.670405618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.611200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.611688 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.111669733 +0000 UTC m=+128.671597543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.698458 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:04 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:04 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:04 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.698531 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.706138 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45390: no serving certificate available for the kubelet" Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.712133 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.712475 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.212459905 +0000 UTC m=+128.772387715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.814218 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.814720 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.31470085 +0000 UTC m=+128.874628660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.915791 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.915940 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.415921525 +0000 UTC m=+128.975849335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:04 crc kubenswrapper[4971]: I0309 09:22:04.916517 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:04 crc kubenswrapper[4971]: E0309 09:22:04.916868 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.416851362 +0000 UTC m=+128.976779162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.018222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.018395 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.518368375 +0000 UTC m=+129.078296185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.018853 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.019184 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.519176249 +0000 UTC m=+129.079104059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.057872 4971 ???:1] "http: TLS handshake error from 192.168.126.11:45404: no serving certificate available for the kubelet" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.119855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.119997 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.619972181 +0000 UTC m=+129.179899991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.120114 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.120408 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.620400444 +0000 UTC m=+129.180328254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.197539 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-n8lbv"] Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.221530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.221739 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.721713151 +0000 UTC m=+129.281640961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.221973 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.222295 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.722284048 +0000 UTC m=+129.282211858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.231950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" event={"ID":"bcd6b63d-8557-4c0b-b000-7d9e14cd229e","Type":"ContainerStarted","Data":"98335c64eb1567a887fc81fcdc17fb2682d4678920f2c2c392401f610a6898c3"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.232811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" event={"ID":"70b1c95e-1326-4a4d-92f8-12df76f6a23a","Type":"ContainerStarted","Data":"fb6702515f9badf816344febadc98388380014042bbb6946613c5d053ab4e320"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.233891 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" event={"ID":"f9f38918-24d8-44d6-9ed5-0d9e69ddc590","Type":"ContainerStarted","Data":"f145b403b014624f84a7d9ac42251cc53566b49a96f16124417665e22cb3d01f"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.234860 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" event={"ID":"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da","Type":"ContainerStarted","Data":"386f5cf24fa758c90a198805cf9d48d4af1a37c1d83641647744043fcdaa80c6"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.235793 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-9lhtb" event={"ID":"8b19b44a-0898-4886-b5d2-4bc4ff950094","Type":"ContainerStarted","Data":"d9b0d455022b2cdbbe8a4626936a9441476adf17c6fa38f452bd4a535031e9f5"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.237783 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" event={"ID":"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e","Type":"ContainerStarted","Data":"8be7aa1de5a95586ddfbf04a2c6be3ea543cbfedfccfa04db4b8e22ff44470fa"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.238975 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" event={"ID":"56909779-30d4-4350-810d-9675796d96ad","Type":"ContainerStarted","Data":"9d7df2d5ec97feb1e2bd2223e5c3e347ce2234ca2c518c1e041aa8d1fc62558f"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.239134 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.240063 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57xjq" event={"ID":"6dba6300-591c-4fc5-8544-b208731d2dc6","Type":"ContainerStarted","Data":"a7c1ac51bfa204724710638966f99bf0221a63fdfbe78c4b3eea9d7d8ad290db"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.240857 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" event={"ID":"2555712b-fa0a-4831-90ca-78d22b2e48b9","Type":"ContainerStarted","Data":"8043e5c566deff82a35ba7b2829ca22de3a082b596fc22b1a6d56c9fb24594b5"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.240864 4971 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5bzw7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.240917 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" podUID="56909779-30d4-4350-810d-9675796d96ad" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.243638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" event={"ID":"50b5a937-bb23-4b89-86a3-6ad4944f5440","Type":"ContainerStarted","Data":"802f378b27ffd546165460a44bbe01ec73bdc7a6503b4c60490553c5f3b0e92c"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.244747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" event={"ID":"5073d2d2-177a-4e70-9638-7fe56084c301","Type":"ContainerStarted","Data":"3ddae319666a5a0fce9514e4cd368ed2c1afce95f1fa86b6302a940487d506bd"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.245766 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" event={"ID":"db64f07f-f1cb-4754-8e1f-33951a826f78","Type":"ContainerStarted","Data":"35654681d24b2c75dca71bc0163b8043062af70662b2b9c4333adc6abb07b0fe"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.246730 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" event={"ID":"a5371ca7-5f2f-4b51-add8-021a77d93c9c","Type":"ContainerStarted","Data":"922644749e8cf1a5421e0e030e2fb1852c0900c90032f65569dc7562eff84000"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.247714 4971 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" event={"ID":"f037869a-34a5-43d9-8c27-6ac17e4fe6b1","Type":"ContainerStarted","Data":"ce3fd08be7f614a3862e49ffa1a17b79b164074bfb3b6b55f57e09d34996b80e"} Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.249156 4971 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lmr9s container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.249201 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.322547 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.322837 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.822818873 +0000 UTC m=+129.382746683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.323157 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.327222 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.82719517 +0000 UTC m=+129.387122980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.425413 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.425614 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.925578852 +0000 UTC m=+129.485506662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.425734 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.426113 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:05.926103658 +0000 UTC m=+129.486031468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.477221 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" podStartSLOduration=65.477206279 podStartE2EDuration="1m5.477206279s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:05.475665334 +0000 UTC m=+129.035593154" watchObservedRunningTime="2026-03-09 09:22:05.477206279 +0000 UTC m=+129.037134089" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.516608 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4zczm" podStartSLOduration=65.516594049 podStartE2EDuration="1m5.516594049s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:05.515828157 +0000 UTC m=+129.075755977" watchObservedRunningTime="2026-03-09 09:22:05.516594049 +0000 UTC m=+129.076521859" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.527388 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.527583 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.02756589 +0000 UTC m=+129.587493700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.527731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.528049 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.028038023 +0000 UTC m=+129.587965833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.554644 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwp59" podStartSLOduration=65.554626629 podStartE2EDuration="1m5.554626629s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:05.553152976 +0000 UTC m=+129.113080796" watchObservedRunningTime="2026-03-09 09:22:05.554626629 +0000 UTC m=+129.114554439" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.629306 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.629509 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.129482675 +0000 UTC m=+129.689410485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.629616 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.629925 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.129917957 +0000 UTC m=+129.689845767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.696871 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:05 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:05 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:05 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.696943 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.714883 4971 ???:1] "http: TLS handshake error from 192.168.126.11:43136: no serving certificate available for the kubelet" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.730609 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.730763 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.230744551 +0000 UTC m=+129.790672361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.730850 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.731115 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.231104171 +0000 UTC m=+129.791031981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.832588 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.833027 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.333007556 +0000 UTC m=+129.892935366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.836072 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:22:05 crc kubenswrapper[4971]: I0309 09:22:05.934646 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:05 crc kubenswrapper[4971]: E0309 09:22:05.935293 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.435280981 +0000 UTC m=+129.995208791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.036997 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.037416 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.537395712 +0000 UTC m=+130.097323522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.130964 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.131792 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.137053 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.137424 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.138610 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.138934 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.638924136 +0000 UTC m=+130.198851946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.147221 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.239376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.239599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.239635 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.239786 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.73976496 +0000 UTC m=+130.299692770 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.285740 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" event={"ID":"f7b8f2b8-0607-467d-8ba2-3b823817b639","Type":"ContainerStarted","Data":"f670dbf2b39249203d4562acbd2bd573dea6d0132bf441e69d38809456d194b2"} Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.296706 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" event={"ID":"13a19b2e-fdd8-41cc-89ac-ed182fa3a449","Type":"ContainerStarted","Data":"d185af342858cbe450cc702a6d81aa0654dd46ac73a169ad9c8d95c7b0ad1e59"} Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.297695 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.300778 4971 generic.go:334] "Generic (PLEG): container finished" podID="1707bff4-eb31-4ed0-bbc5-054813b1a34a" containerID="47815d72534e469733e7130be9a8d78589dcbf3ab1208bc7459ae3ec23a27e59" exitCode=0 Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.300831 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" event={"ID":"1707bff4-eb31-4ed0-bbc5-054813b1a34a","Type":"ContainerDied","Data":"47815d72534e469733e7130be9a8d78589dcbf3ab1208bc7459ae3ec23a27e59"} Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.312587 4971 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b65bx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.312642 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" podUID="13a19b2e-fdd8-41cc-89ac-ed182fa3a449" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.316238 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" event={"ID":"e8a3dd14-c5c3-4251-88bf-31dcafe04ef1","Type":"ContainerStarted","Data":"a9d19be924ebb3b50ae04cb8312e2e2d8397eec64334700bdbfe00d216767c84"} Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.316939 4971 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5bzw7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.316982 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" podUID="56909779-30d4-4350-810d-9675796d96ad" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.317265 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" gracePeriod=30 Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.341418 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.341497 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.341747 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.344702 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:06.844628511 +0000 UTC m=+130.404556321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.345463 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.373682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.389986 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" podStartSLOduration=66.389971914 podStartE2EDuration="1m6.389971914s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:06.389172771 +0000 UTC m=+129.949100581" watchObservedRunningTime="2026-03-09 09:22:06.389971914 +0000 UTC m=+129.949899724" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.448131 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.448469 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.948445011 +0000 UTC m=+130.508372821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.451265 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.451708 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:06.951696826 +0000 UTC m=+130.511624636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.470039 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xhbrd" podStartSLOduration=66.470025021 podStartE2EDuration="1m6.470025021s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:06.424229184 +0000 UTC m=+129.984156994" watchObservedRunningTime="2026-03-09 09:22:06.470025021 +0000 UTC m=+130.029952831" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.553428 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.553588 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.05355455 +0000 UTC m=+130.613482360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.554037 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.554314 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.054304692 +0000 UTC m=+130.614232502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.559437 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.656537 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.656878 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.156851204 +0000 UTC m=+130.716779014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.699909 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:06 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:06 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:06 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.699968 4971 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.758231 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.758627 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.258617375 +0000 UTC m=+130.818545175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.796201 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h78mk" Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.861770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.862181 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.362162527 +0000 UTC m=+130.922090337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:06 crc kubenswrapper[4971]: I0309 09:22:06.969257 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:06 crc kubenswrapper[4971]: E0309 09:22:06.971210 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.47119659 +0000 UTC m=+131.031124400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.034853 4971 ???:1] "http: TLS handshake error from 192.168.126.11:43146: no serving certificate available for the kubelet" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.057153 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.076967 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.077260 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.577227055 +0000 UTC m=+131.137154865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: W0309 09:22:07.079582 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf154ebf8_2843_4580_ae89_fbfcb0d6c5c1.slice/crio-e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97 WatchSource:0}: Error finding container e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97: Status 404 returned error can't find the container with id e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97 Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.178168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.178538 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.678522522 +0000 UTC m=+131.238450332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.279280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.279538 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.7795086 +0000 UTC m=+131.339436420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.279620 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.280031 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.779984914 +0000 UTC m=+131.339912724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.322584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" event={"ID":"39079cdf-1b40-4f77-ad11-3816fc89e3df","Type":"ContainerStarted","Data":"bcd22b01f48a5951d5aee9d5327360fe5dd86bc343ba1640d03b633cca167a62"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.324319 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" event={"ID":"79d6be06-8c45-4058-a2ff-5daf63d0404e","Type":"ContainerStarted","Data":"67ee5ac8abc13a4b4e980b13d510951072fbfa2e6ce8ea5bd0cdb3c451e13cd8"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.324996 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.327140 4971 patch_prober.go:28] interesting pod/console-operator-58897d9998-wf8hd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.327184 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" podUID="79d6be06-8c45-4058-a2ff-5daf63d0404e" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.334807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" event={"ID":"70b1c95e-1326-4a4d-92f8-12df76f6a23a","Type":"ContainerStarted","Data":"bb3a286d82cee965ba9ca19b7be6268ae3e147bee10a83aa90e858730c3371f6"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.334864 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.339112 4971 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-t9sxl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.339172 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.344244 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h8b5s" podStartSLOduration=67.344223259 podStartE2EDuration="1m7.344223259s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.33774585 +0000 UTC m=+130.897673680" 
watchObservedRunningTime="2026-03-09 09:22:07.344223259 +0000 UTC m=+130.904151059" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.345576 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" event={"ID":"db64f07f-f1cb-4754-8e1f-33951a826f78","Type":"ContainerStarted","Data":"3ae957b1af83224d5e5d9bb95e4cd67c85eb2c4e90041cd91194f0dc0950b592"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.354947 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" event={"ID":"c91cf18b-1765-48d3-9e00-66747b628f33","Type":"ContainerStarted","Data":"cb4cc0da2a737c2617dc29ce62665e46965efdc1694e1b8a16618d95b56737b6"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.356603 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.362084 4971 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-j9sc4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.362150 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" podUID="c91cf18b-1765-48d3-9e00-66747b628f33" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.365107 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" 
event={"ID":"f037869a-34a5-43d9-8c27-6ac17e4fe6b1","Type":"ContainerStarted","Data":"ad62530a89e690bfaa44634a826484ef9779306188f05f7782540fd99a3ede2c"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.367766 4971 generic.go:334] "Generic (PLEG): container finished" podID="5073d2d2-177a-4e70-9638-7fe56084c301" containerID="5f0d8622f75ee1c452c65b30a180f2f9dc2340d737a1300a2527569508232c0a" exitCode=0 Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.368071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" event={"ID":"5073d2d2-177a-4e70-9638-7fe56084c301","Type":"ContainerDied","Data":"5f0d8622f75ee1c452c65b30a180f2f9dc2340d737a1300a2527569508232c0a"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.371963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" event={"ID":"a5371ca7-5f2f-4b51-add8-021a77d93c9c","Type":"ContainerStarted","Data":"c102693e609f26c1c7a63e82f098274ef4a7167a3098c927c17c3c7fb86ca185"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.380418 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.380672 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.880626642 +0000 UTC m=+131.440554452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.380732 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.381170 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.881161228 +0000 UTC m=+131.441089038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.382848 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" event={"ID":"339ca768-fe61-40dd-8a4f-93363aa23972","Type":"ContainerStarted","Data":"426866addde9da85bfdf1e2f765bc832a9b2093d91b141f60b8669864ef64f40"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.382931 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.386317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" event={"ID":"c9cdbff0-0cca-4375-8c92-1117ce5d1dea","Type":"ContainerStarted","Data":"28ea14d8b25b5f9f030519aa4bd4ff822992da2131f24fb49dd44e1eca45b151"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.386745 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" podStartSLOduration=68.38672483 podStartE2EDuration="1m8.38672483s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.379461628 +0000 UTC m=+130.939389448" watchObservedRunningTime="2026-03-09 09:22:07.38672483 +0000 UTC m=+130.946652640" Mar 09 09:22:07 crc 
kubenswrapper[4971]: I0309 09:22:07.388504 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" podStartSLOduration=67.388493962 podStartE2EDuration="1m7.388493962s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.359923378 +0000 UTC m=+130.919851188" watchObservedRunningTime="2026-03-09 09:22:07.388493962 +0000 UTC m=+130.948421762" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.388668 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1","Type":"ContainerStarted","Data":"e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.389918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" event={"ID":"50b5a937-bb23-4b89-86a3-6ad4944f5440","Type":"ContainerStarted","Data":"bc5420b49c832919f07650ac409554e0244fd6ae270ad44b5dd7b210acb97dff"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.392478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57xjq" event={"ID":"6dba6300-591c-4fc5-8544-b208731d2dc6","Type":"ContainerStarted","Data":"b45714edfc7a01e7cb0c251daadbcda62e09e582a36a1e1533adc2a248b0e313"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.402271 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-txc2s" podStartSLOduration=68.402243053 podStartE2EDuration="1m8.402243053s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.396240048 +0000 UTC m=+130.956167858" watchObservedRunningTime="2026-03-09 09:22:07.402243053 +0000 UTC m=+130.962170863" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.414965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9lhtb" event={"ID":"8b19b44a-0898-4886-b5d2-4bc4ff950094","Type":"ContainerStarted","Data":"9617d06fa7e932f9cb8b92d917bd544ef020112ce1ad126e2465d83e46672f8c"} Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.416234 4971 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b65bx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.416314 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" podUID="13a19b2e-fdd8-41cc-89ac-ed182fa3a449" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.481970 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.482175 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:07.982115385 +0000 UTC m=+131.542043205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.482705 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.484364 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pf6b7" podStartSLOduration=68.484320499 podStartE2EDuration="1m8.484320499s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.46518365 +0000 UTC m=+131.025111460" watchObservedRunningTime="2026-03-09 09:22:07.484320499 +0000 UTC m=+131.044248309" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.486168 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:07.986149603 +0000 UTC m=+131.546077413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.504114 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" podStartSLOduration=67.504091926 podStartE2EDuration="1m7.504091926s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.495923068 +0000 UTC m=+131.055850888" watchObservedRunningTime="2026-03-09 09:22:07.504091926 +0000 UTC m=+131.064019726" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.518742 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-57xjq" podStartSLOduration=11.518724853 podStartE2EDuration="11.518724853s" podCreationTimestamp="2026-03-09 09:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.515978453 +0000 UTC m=+131.075906263" watchObservedRunningTime="2026-03-09 09:22:07.518724853 +0000 UTC m=+131.078652663" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.537850 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dvp8t" podStartSLOduration=67.537832411 podStartE2EDuration="1m7.537832411s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.534788492 +0000 UTC m=+131.094716322" watchObservedRunningTime="2026-03-09 09:22:07.537832411 +0000 UTC m=+131.097760221" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.583935 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mct42" podStartSLOduration=68.583918096 podStartE2EDuration="1m8.583918096s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.556685752 +0000 UTC m=+131.116613562" watchObservedRunningTime="2026-03-09 09:22:07.583918096 +0000 UTC m=+131.143845906" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.584545 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" podStartSLOduration=67.584539595 podStartE2EDuration="1m7.584539595s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:07.578069206 +0000 UTC m=+131.137997016" watchObservedRunningTime="2026-03-09 09:22:07.584539595 +0000 UTC m=+131.144467395" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.584604 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.584841 4971 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.084822183 +0000 UTC m=+131.644750003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.584962 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.586222 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.086204893 +0000 UTC m=+131.646132773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.665719 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd6b63d_8557_4c0b_b000_7d9e14cd229e.slice/crio-conmon-a1da45f80271302f0f505b4eb8572926a37362dd6024a91b40ca65802105a50f.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.686190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.686586 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.186568323 +0000 UTC m=+131.746496133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.700944 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:07 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:07 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:07 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.700996 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.785017 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.787730 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.788015 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.288000884 +0000 UTC m=+131.847928694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.888534 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.888576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npm8w\" (UniqueName: 
\"kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w\") pod \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.888661 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume\") pod \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.888681 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.388659092 +0000 UTC m=+131.948586902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.888714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume\") pod \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\" (UID: \"1707bff4-eb31-4ed0-bbc5-054813b1a34a\") " Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.888978 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.889328 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.389321982 +0000 UTC m=+131.949249792 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.890048 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume" (OuterVolumeSpecName: "config-volume") pod "1707bff4-eb31-4ed0-bbc5-054813b1a34a" (UID: "1707bff4-eb31-4ed0-bbc5-054813b1a34a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.897781 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w" (OuterVolumeSpecName: "kube-api-access-npm8w") pod "1707bff4-eb31-4ed0-bbc5-054813b1a34a" (UID: "1707bff4-eb31-4ed0-bbc5-054813b1a34a"). InnerVolumeSpecName "kube-api-access-npm8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.898285 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1707bff4-eb31-4ed0-bbc5-054813b1a34a" (UID: "1707bff4-eb31-4ed0-bbc5-054813b1a34a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.990162 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.990518 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npm8w\" (UniqueName: \"kubernetes.io/projected/1707bff4-eb31-4ed0-bbc5-054813b1a34a-kube-api-access-npm8w\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.990538 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1707bff4-eb31-4ed0-bbc5-054813b1a34a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:07 crc kubenswrapper[4971]: I0309 09:22:07.990550 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1707bff4-eb31-4ed0-bbc5-054813b1a34a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:07 crc kubenswrapper[4971]: E0309 09:22:07.990628 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:22:08.490607638 +0000 UTC m=+132.050535448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.091892 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.092332 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.592313937 +0000 UTC m=+132.152241747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.193466 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.193856 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.693841881 +0000 UTC m=+132.253769691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.295095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.295517 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.795505159 +0000 UTC m=+132.355432969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.396280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.396474 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.896449336 +0000 UTC m=+132.456377146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.396924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.397328 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:08.897313501 +0000 UTC m=+132.457241311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.445648 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" event={"ID":"2555712b-fa0a-4831-90ca-78d22b2e48b9","Type":"ContainerStarted","Data":"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.445992 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.447839 4971 generic.go:334] "Generic (PLEG): container finished" podID="f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" containerID="4f91de6b44a51a663147d58f400843046d37d96b9c8d322c61fee9584fd5b5c5" exitCode=0 Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.447910 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1","Type":"ContainerDied","Data":"4f91de6b44a51a663147d58f400843046d37d96b9c8d322c61fee9584fd5b5c5"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.448267 4971 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nvzgg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.448308 4971 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.449561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" event={"ID":"f7b8f2b8-0607-467d-8ba2-3b823817b639","Type":"ContainerStarted","Data":"01631200bc7d01243ad0b2d2d8dae016e895155b9a054fbf6e7ebbd76d60359d"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.451018 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" event={"ID":"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162","Type":"ContainerStarted","Data":"5d6c193fd0f43389bf3e140929d41a9a1de82622988ab3209e4fe1f673010da3"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.452840 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9lhtb" event={"ID":"8b19b44a-0898-4886-b5d2-4bc4ff950094","Type":"ContainerStarted","Data":"1064628de30bb7999381ee1f4ee15774cd5760564551f408c732574b01c0c2e0"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.458458 4971 generic.go:334] "Generic (PLEG): container finished" podID="bcd6b63d-8557-4c0b-b000-7d9e14cd229e" containerID="a1da45f80271302f0f505b4eb8572926a37362dd6024a91b40ca65802105a50f" exitCode=0 Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.458557 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" event={"ID":"bcd6b63d-8557-4c0b-b000-7d9e14cd229e","Type":"ContainerDied","Data":"a1da45f80271302f0f505b4eb8572926a37362dd6024a91b40ca65802105a50f"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.464734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" event={"ID":"1707bff4-eb31-4ed0-bbc5-054813b1a34a","Type":"ContainerDied","Data":"5af6fc53c5463f604b20674cee7524f118f21819a0e8eb5f4d3e5d345bd8be5b"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.464988 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af6fc53c5463f604b20674cee7524f118f21819a0e8eb5f4d3e5d345bd8be5b" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.464750 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.469711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" event={"ID":"5073d2d2-177a-4e70-9638-7fe56084c301","Type":"ContainerStarted","Data":"edbccf7234c8650a23ec2a1ab6cc50bf4b7f2d74a83969b0e7a9f1ac1ae5a221"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.473878 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" event={"ID":"db64f07f-f1cb-4754-8e1f-33951a826f78","Type":"ContainerStarted","Data":"5ecfb06609b54a2852428d244cf24c1d06d0b5fd72baa9728a99858b2ebf9e91"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.478255 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" event={"ID":"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e","Type":"ContainerStarted","Data":"78b4de171d63fcfaf85b69c25920955f0007d29911ddc4d65f9cdc58ca29a701"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.478300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" event={"ID":"7e1d5ee3-5d9c-4d44-bf5a-343216e8803e","Type":"ContainerStarted","Data":"421c5397188ce46036fada89b5c91a42e610b209736530d5d824166fe961018e"} Mar 09 
09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.483030 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" event={"ID":"2cb2b3b4-5e17-41ef-9f6c-49a86ee0a6da","Type":"ContainerStarted","Data":"b216aed8d4f8322bbfc23a9d841286d6a44994042563ce0c399e3a37014059f8"} Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.483768 4971 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-j9sc4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.483811 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" podUID="c91cf18b-1765-48d3-9e00-66747b628f33" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.498562 4971 patch_prober.go:28] interesting pod/console-operator-58897d9998-wf8hd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.498651 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" podUID="79d6be06-8c45-4058-a2ff-5daf63d0404e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.498562 4971 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b65bx container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.498729 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx" podUID="13a19b2e-fdd8-41cc-89ac-ed182fa3a449" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.499916 4971 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-t9sxl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.499991 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.502277 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.502842 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" 
podStartSLOduration=69.502826121 podStartE2EDuration="1m9.502826121s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.499246796 +0000 UTC m=+132.059174606" watchObservedRunningTime="2026-03-09 09:22:08.502826121 +0000 UTC m=+132.062753931" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.504777 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.004664535 +0000 UTC m=+132.564592355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.537291 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t9hb6" podStartSLOduration=68.537265636 podStartE2EDuration="1m8.537265636s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.532922289 +0000 UTC m=+132.092850099" watchObservedRunningTime="2026-03-09 09:22:08.537265636 +0000 UTC m=+132.097193466" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.604205 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.606811 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.106795856 +0000 UTC m=+132.666723666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.626877 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nw59v" podStartSLOduration=68.626855361 podStartE2EDuration="1m8.626855361s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.626240403 +0000 UTC m=+132.186168213" watchObservedRunningTime="2026-03-09 09:22:08.626855361 +0000 UTC m=+132.186783171" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.628733 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" podStartSLOduration=68.628723026 podStartE2EDuration="1m8.628723026s" 
podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.59699323 +0000 UTC m=+132.156921040" watchObservedRunningTime="2026-03-09 09:22:08.628723026 +0000 UTC m=+132.188650846" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.661822 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-48g6z" podStartSLOduration=68.661803632 podStartE2EDuration="1m8.661803632s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.660942386 +0000 UTC m=+132.220870196" watchObservedRunningTime="2026-03-09 09:22:08.661803632 +0000 UTC m=+132.221731442" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.704666 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:08 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:08 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:08 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.704747 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.716917 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.717205 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.217186138 +0000 UTC m=+132.777113948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.717291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.717638 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.217631241 +0000 UTC m=+132.777559051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.786952 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9lhtb" podStartSLOduration=68.786935374 podStartE2EDuration="1m8.786935374s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.71689812 +0000 UTC m=+132.276825930" watchObservedRunningTime="2026-03-09 09:22:08.786935374 +0000 UTC m=+132.346863184" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.818949 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.819298 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.319280939 +0000 UTC m=+132.879208749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.901069 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7xwd6" podStartSLOduration=68.901036015 podStartE2EDuration="1m8.901036015s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:08.821632227 +0000 UTC m=+132.381560037" watchObservedRunningTime="2026-03-09 09:22:08.901036015 +0000 UTC m=+132.460963815" Mar 09 09:22:08 crc kubenswrapper[4971]: I0309 09:22:08.920663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:08 crc kubenswrapper[4971]: E0309 09:22:08.921081 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.421061845 +0000 UTC m=+132.980989655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.022149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.022324 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.522299404 +0000 UTC m=+133.082227214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.022431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.022777 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.522767971 +0000 UTC m=+133.082695781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.123759 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.124157 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.624142986 +0000 UTC m=+133.184070796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.155129 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.155337 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.225194 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.225592 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.725575272 +0000 UTC m=+133.285503082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.333770 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.334009 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.833987619 +0000 UTC m=+133.393915429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.334133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.334431 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.834424064 +0000 UTC m=+133.394351874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.435118 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.435634 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:09.935618392 +0000 UTC m=+133.495546202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.490784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" event={"ID":"bcd6b63d-8557-4c0b-b000-7d9e14cd229e","Type":"ContainerStarted","Data":"dee9a4285b72567ba1be91a695893ee90ad697406d7f7171ef53986f53a474dd"} Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.491997 4971 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nvzgg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.492398 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.492603 4971 patch_prober.go:28] interesting pod/console-operator-58897d9998-wf8hd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.492649 4971 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-wf8hd" podUID="79d6be06-8c45-4058-a2ff-5daf63d0404e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.538688 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.539181 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.039160874 +0000 UTC m=+133.599088694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.587292 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j9sc4" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.640007 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.644414 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.144392067 +0000 UTC m=+133.704319927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.672655 4971 ???:1] "http: TLS handshake error from 192.168.126.11:43148: no serving certificate available for the kubelet" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.698769 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:09 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:09 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:09 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.698840 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.741807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.742129 4971 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.24211741 +0000 UTC m=+133.802045240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.842527    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.842787    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.342772619 +0000 UTC m=+133.902700429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.862714    4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 09:22:09 crc kubenswrapper[4971]: I0309 09:22:09.943887    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:09 crc kubenswrapper[4971]: E0309 09:22:09.944139    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.444100032 +0000 UTC m=+134.004027842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.044833    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.044908    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access\") pod \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.045014    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.544991919 +0000 UTC m=+134.104919719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.045057    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir\") pod \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\" (UID: \"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.045139    4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" (UID: "f154ebf8-2843-4580-ae89-fbfcb0d6c5c1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.045410    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.045553    4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.045744    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.545729705 +0000 UTC m=+134.105657555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.051800    4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" (UID: "f154ebf8-2843-4580-ae89-fbfcb0d6c5c1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.173401    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.173521    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.673495195 +0000 UTC m=+134.233422995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.173665    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.173729    4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f154ebf8-2843-4580-ae89-fbfcb0d6c5c1-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.173945    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.67392934 +0000 UTC m=+134.233857150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.274560    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.274709    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.774692893 +0000 UTC m=+134.334620703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.274762    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.275175    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.775156249 +0000 UTC m=+134.335084059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.375378    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.375601    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.875552239 +0000 UTC m=+134.435480069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.476684    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.477060    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:10.977042297 +0000 UTC m=+134.536970107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.504651    4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.504836    4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f154ebf8-2843-4580-ae89-fbfcb0d6c5c1","Type":"ContainerDied","Data":"e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97"}
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.504876    4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e411ce532ffde5887a09289368b86ce4c13a5b81f2566c3d35144d04f5b6fd97"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.533469    4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" event={"ID":"bcd6b63d-8557-4c0b-b000-7d9e14cd229e","Type":"ContainerStarted","Data":"4fa61cbabf59634bcab568905f942ce65f0cbbf5fbda3464e3c00793243b7ee3"}
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.568869    4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" podStartSLOduration=71.568850218 podStartE2EDuration="1m11.568850218s" podCreationTimestamp="2026-03-09 09:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:10.567038453 +0000 UTC m=+134.126966263" watchObservedRunningTime="2026-03-09 09:22:10.568850218 +0000 UTC m=+134.128778018"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.577441    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.577687    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.077653844 +0000 UTC m=+134.637581654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.578475    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.579002    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.078992862 +0000 UTC m=+134.638920672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.679427    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.679548    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.179530456 +0000 UTC m=+134.739458266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.679815    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.680255    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.180236542 +0000 UTC m=+134.740164352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.698774    4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:22:10 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld
Mar 09 09:22:10 crc kubenswrapper[4971]: [+]process-running ok
Mar 09 09:22:10 crc kubenswrapper[4971]: healthz check failed
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.698859    4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.780571    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.780759    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.280733335 +0000 UTC m=+134.840661145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.780824    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.781203    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.281194311 +0000 UTC m=+134.841122121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.803972    4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.804042    4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.804100    4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.804046    4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.815503    4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.815556    4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dnx9z"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.817306    4971 patch_prober.go:28] interesting pod/console-f9d7485db-dnx9z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.817396    4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dnx9z" podUID="c8c3ac1c-4896-4db2-8917-0a57667a1fa8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.881407    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.881618    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.38159175 +0000 UTC m=+134.941519570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:10 crc kubenswrapper[4971]: I0309 09:22:10.982385    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:10 crc kubenswrapper[4971]: E0309 09:22:10.982800    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.482783178 +0000 UTC m=+135.042710988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.082905    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.083016    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.583002731 +0000 UTC m=+135.142930541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.083318    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.083574    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.583566751 +0000 UTC m=+135.143494561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.206008    4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.206798    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.706771748 +0000 UTC m=+135.266699588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.227380    4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.227433    4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ft9v2"]
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.227790    4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1707bff4-eb31-4ed0-bbc5-054813b1a34a" containerName="collect-profiles"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.227813    4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1707bff4-eb31-4ed0-bbc5-054813b1a34a" containerName="collect-profiles"
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.227847    4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" containerName="pruner"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.227855    4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" containerName="pruner"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.228076    4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1707bff4-eb31-4ed0-bbc5-054813b1a34a" containerName="collect-profiles"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.228131    4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f154ebf8-2843-4580-ae89-fbfcb0d6c5c1" containerName="pruner"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.230213    4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft9v2"]
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.230336    4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.233728    4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.251629    4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.265941    4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.293501    4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b65bx"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.311103    4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.311210    4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.311910    4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf854\" (UniqueName: \"kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.311989    4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.312833    4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.81282156 +0000 UTC m=+135.372749370 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.320596 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bzw7" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.327298 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.330797 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.336103 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.336151 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.363343 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.363327031 podStartE2EDuration="363.327031ms" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:11.325663121 +0000 UTC m=+134.885590931" watchObservedRunningTime="2026-03-09 09:22:11.363327031 +0000 UTC m=+134.923254841" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.393050 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.393032266 podStartE2EDuration="393.032266ms" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:11.367159138 +0000 UTC m=+134.927086948" watchObservedRunningTime="2026-03-09 09:22:11.393032266 +0000 UTC m=+134.952960076" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.412897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.413146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf854\" (UniqueName: 
\"kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.413199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.413266 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.413652 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:11.913631884 +0000 UTC m=+135.473559704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.414319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.414752 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.445545 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf854\" (UniqueName: \"kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854\") pod \"certified-operators-ft9v2\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.512790 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.513856 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.514070 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.515405 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.015388113 +0000 UTC m=+135.575315933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.538121 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.543459 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wf8hd" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.559977 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.574548 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" event={"ID":"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162","Type":"ContainerStarted","Data":"c409e2fff7fa7dbe8e7e016c088f780e1236803b5b83b8574559b36af6b8601e"} Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.574591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" event={"ID":"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162","Type":"ContainerStarted","Data":"ac067952cfeb6780d83d3df8c62f0268700ca9dc59a3ccd384a193c4d622365a"} Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.614896 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.615174 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.115142409 +0000 UTC m=+135.675070219 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.615727 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.615817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.615858 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.615919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9cl4\" (UniqueName: 
\"kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.618121 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.118110605 +0000 UTC m=+135.678038415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.640488 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.640542 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.647299 4971 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rqlbq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.647360 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" podUID="bcd6b63d-8557-4c0b-b000-7d9e14cd229e" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.695609 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.700378 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:11 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:11 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:11 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.700429 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.710980 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5cqt9"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.712155 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.714383 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.718173 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.718419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9cl4\" (UniqueName: \"kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.718466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmbw\" (UniqueName: \"kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.718503 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.718545 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.719365 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.219329714 +0000 UTC m=+135.779257524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.719473 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.719512 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc 
kubenswrapper[4971]: I0309 09:22:11.720283 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.720558 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.732499 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cqt9"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.742206 4971 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.765006 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9cl4\" (UniqueName: \"kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4\") pod \"certified-operators-xnbvj\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.820436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.820499 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srmbw\" (UniqueName: \"kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.820518 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.820551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.820972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.821155 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " 
pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.821453 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.321437085 +0000 UTC m=+135.881364895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.824958 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.825713 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.830608 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.833677 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.844589 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.845083 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.852158 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmbw\" (UniqueName: \"kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw\") pod \"community-operators-5cqt9\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.886928 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft9v2"] Mar 09 09:22:11 crc kubenswrapper[4971]: W0309 09:22:11.904551 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1054c243_8a85_4262_ba12_2ee5643d0255.slice/crio-d9608f12a07ea913b8527583933f427cec7f268c21001b412b23477ab6fd1bff WatchSource:0}: Error finding container d9608f12a07ea913b8527583933f427cec7f268c21001b412b23477ab6fd1bff: Status 404 returned error can't find the container with id d9608f12a07ea913b8527583933f427cec7f268c21001b412b23477ab6fd1bff Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.918770 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.920129 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.921221 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.921328 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.421296855 +0000 UTC m=+135.981224665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.921472 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.921510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.921545 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.921725 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwdd\" (UniqueName: \"kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:11 crc kubenswrapper[4971]: E0309 09:22:11.921884 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.421876405 +0000 UTC m=+135.981804215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.962401 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.962493 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.962513 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:11 crc kubenswrapper[4971]: I0309 09:22:11.979850 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.024588 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.024911 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.024991 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.025079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.025129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.025205 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwdd\" (UniqueName: \"kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.025874 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " 
pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: E0309 09:22:12.025970 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.525951157 +0000 UTC m=+136.085878967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.026620 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.050019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fwdd\" (UniqueName: \"kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd\") pod \"community-operators-4dhm7\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.050630 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.062917 4971 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.063133 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" containerID="cri-o://8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560" gracePeriod=30 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.068943 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.071666 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.109166 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.125897 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.125997 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 
09:22:12.126049 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: E0309 09:22:12.127990 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.627977444 +0000 UTC m=+136.187905254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-d6xhv" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.128169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.154450 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 
09:22:12.156873 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.227513 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:12 crc kubenswrapper[4971]: E0309 09:22:12.227788 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:22:12.727773262 +0000 UTC m=+136.287701072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.244845 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.282125 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.303690 4971 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T09:22:11.742224915Z","Handler":null,"Name":""} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.312238 4971 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.312275 4971 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 09:22:12 crc kubenswrapper[4971]: W0309 09:22:12.322125 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1992d44_6e31_4432_88f0_320408d9fa70.slice/crio-6e3e4d13648d14f7da9a96b1c15e0ea8e88af6582f1ec2d09c83c3ad86c96162 WatchSource:0}: Error finding container 6e3e4d13648d14f7da9a96b1c15e0ea8e88af6582f1ec2d09c83c3ad86c96162: Status 404 returned error can't find the container with id 6e3e4d13648d14f7da9a96b1c15e0ea8e88af6582f1ec2d09c83c3ad86c96162 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.328713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.372350 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.372405 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.479332 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-d6xhv\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") " pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.512372 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.535036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.544630 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.591383 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.593542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" event={"ID":"91b993d3-35bb-4b9b-9e1a-ca96fa6f8162","Type":"ContainerStarted","Data":"d4418d5eced6aa0292116313d3b50a496fa435dd25cb800dc3d47d10a3fb9d63"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.595033 4971 generic.go:334] "Generic (PLEG): container finished" podID="1054c243-8a85-4262-ba12-2ee5643d0255" containerID="abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f" exitCode=0 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.595550 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerDied","Data":"abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.595569 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerStarted","Data":"d9608f12a07ea913b8527583933f427cec7f268c21001b412b23477ab6fd1bff"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.601131 4971 generic.go:334] "Generic (PLEG): container finished" podID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerID="8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560" exitCode=0 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.601203 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" event={"ID":"0325b4dc-fe2a-4685-8e37-621a96f6b976","Type":"ContainerDied","Data":"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.601243 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" event={"ID":"0325b4dc-fe2a-4685-8e37-621a96f6b976","Type":"ContainerDied","Data":"2c7ea772321c8be3f72680f9f9d2af70bb7eebc7251a552b40a308fe34cc0adb"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.601260 4971 scope.go:117] "RemoveContainer" containerID="8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.601386 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmr9s" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.610135 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" containerID="cri-o://bb3a286d82cee965ba9ca19b7be6268ae3e147bee10a83aa90e858730c3371f6" gracePeriod=30 Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.610367 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerStarted","Data":"6e3e4d13648d14f7da9a96b1c15e0ea8e88af6582f1ec2d09c83c3ad86c96162"} Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.620183 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-brs7r" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.637996 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config\") pod \"0325b4dc-fe2a-4685-8e37-621a96f6b976\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.638060 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htn6w\" (UniqueName: \"kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w\") pod \"0325b4dc-fe2a-4685-8e37-621a96f6b976\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.638101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert\") pod \"0325b4dc-fe2a-4685-8e37-621a96f6b976\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.638170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles\") pod \"0325b4dc-fe2a-4685-8e37-621a96f6b976\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.638289 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca\") pod \"0325b4dc-fe2a-4685-8e37-621a96f6b976\" (UID: \"0325b4dc-fe2a-4685-8e37-621a96f6b976\") " Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.640000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config" (OuterVolumeSpecName: "config") pod "0325b4dc-fe2a-4685-8e37-621a96f6b976" (UID: "0325b4dc-fe2a-4685-8e37-621a96f6b976"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.642087 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca" (OuterVolumeSpecName: "client-ca") pod "0325b4dc-fe2a-4685-8e37-621a96f6b976" (UID: "0325b4dc-fe2a-4685-8e37-621a96f6b976"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.643690 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0325b4dc-fe2a-4685-8e37-621a96f6b976" (UID: "0325b4dc-fe2a-4685-8e37-621a96f6b976"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.646613 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0325b4dc-fe2a-4685-8e37-621a96f6b976" (UID: "0325b4dc-fe2a-4685-8e37-621a96f6b976"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.654478 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.659658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w" (OuterVolumeSpecName: "kube-api-access-htn6w") pod "0325b4dc-fe2a-4685-8e37-621a96f6b976" (UID: "0325b4dc-fe2a-4685-8e37-621a96f6b976"). InnerVolumeSpecName "kube-api-access-htn6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.661524 4971 scope.go:117] "RemoveContainer" containerID="8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560" Mar 09 09:22:12 crc kubenswrapper[4971]: E0309 09:22:12.661925 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560\": container with ID starting with 8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560 not found: ID does not exist" containerID="8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.661985 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560"} err="failed to get container status \"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560\": rpc error: code = NotFound desc = could not find container \"8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560\": container with ID starting with 8cb837315cb8cfd11cf59cc4c075ade7c9b05fd48d55400bb199dc80921c1560 not found: ID does not exist" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.671840 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-krqgm" podStartSLOduration=16.671818822 podStartE2EDuration="16.671818822s" podCreationTimestamp="2026-03-09 09:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:12.665270627 +0000 UTC m=+136.225198437" watchObservedRunningTime="2026-03-09 09:22:12.671818822 +0000 UTC m=+136.231746632" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.700280 4971 patch_prober.go:28] interesting 
pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:12 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:12 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:12 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.700332 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.740445 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.740483 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.740493 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htn6w\" (UniqueName: \"kubernetes.io/projected/0325b4dc-fe2a-4685-8e37-621a96f6b976-kube-api-access-htn6w\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.740504 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0325b4dc-fe2a-4685-8e37-621a96f6b976-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.740512 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0325b4dc-fe2a-4685-8e37-621a96f6b976-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.776702 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cqt9"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.803167 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"] Mar 09 09:22:12 crc kubenswrapper[4971]: W0309 09:22:12.826609 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode807cb52_ad3e_4bb8_85d1_b6e3ee6870dd.slice/crio-010e36b761268009098dac8b6b3acec97266f39fcdde4db68cb4c18464d08624 WatchSource:0}: Error finding container 010e36b761268009098dac8b6b3acec97266f39fcdde4db68cb4c18464d08624: Status 404 returned error can't find the container with id 010e36b761268009098dac8b6b3acec97266f39fcdde4db68cb4c18464d08624 Mar 09 09:22:12 crc kubenswrapper[4971]: W0309 09:22:12.828785 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ddfae4b_5893_4e15_a983_1adb19c5970e.slice/crio-2a9d85b0ab65b31748dff0688cc9a2b07211e62ad6aacbb46b667420aa51f3cc WatchSource:0}: Error finding container 2a9d85b0ab65b31748dff0688cc9a2b07211e62ad6aacbb46b667420aa51f3cc: Status 404 returned error can't find the container with id 2a9d85b0ab65b31748dff0688cc9a2b07211e62ad6aacbb46b667420aa51f3cc Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.929885 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.954619 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:22:12 crc kubenswrapper[4971]: I0309 09:22:12.957993 4971 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmr9s"] Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.051531 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-th2ls" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.160406 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" path="/var/lib/kubelet/pods/0325b4dc-fe2a-4685-8e37-621a96f6b976/volumes" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.161154 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.409917 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"] Mar 09 09:22:13 crc kubenswrapper[4971]: E0309 09:22:13.410418 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.410530 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.410721 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0325b4dc-fe2a-4685-8e37-621a96f6b976" containerName="controller-manager" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.411202 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.413293 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.413758 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.414115 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.414227 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.414384 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.414475 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.423065 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"] Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.424543 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.448412 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfd8\" (UniqueName: \"kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " 
pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.448457 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.448503 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.448734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.448815 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.511946 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"] Mar 09 09:22:13 crc 
kubenswrapper[4971]: I0309 09:22:13.514128 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.516680 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.521458 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"] Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549830 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549884 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfd8\" (UniqueName: \"kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549911 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549937 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549972 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6tj7\" (UniqueName: \"kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.549992 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.550059 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.550079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.551951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.552416 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.558142 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.558221 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.568626 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfd8\" (UniqueName: \"kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8\") pod \"controller-manager-6b969cd678-zr6fw\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") " pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.616206 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" event={"ID":"7ddfae4b-5893-4e15-a983-1adb19c5970e","Type":"ContainerStarted","Data":"2a9d85b0ab65b31748dff0688cc9a2b07211e62ad6aacbb46b667420aa51f3cc"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.619886 4971 generic.go:334] "Generic (PLEG): container finished" podID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerID="bb3a286d82cee965ba9ca19b7be6268ae3e147bee10a83aa90e858730c3371f6" exitCode=0 Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.619953 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" event={"ID":"70b1c95e-1326-4a4d-92f8-12df76f6a23a","Type":"ContainerDied","Data":"bb3a286d82cee965ba9ca19b7be6268ae3e147bee10a83aa90e858730c3371f6"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.620950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86be3faf-7eff-4890-8a02-5c541621b4c3","Type":"ContainerStarted","Data":"3865aac07391fe61f4debd2209ec825d63445a6a367315ffa8a02cf254eb3a80"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.622166 4971 generic.go:334] "Generic (PLEG): container finished" podID="b1992d44-6e31-4432-88f0-320408d9fa70" containerID="40ef5fadc5edc6653f20762069248da053830ef3814164e7a93c1c44551a7218" exitCode=0 Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.622228 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerDied","Data":"40ef5fadc5edc6653f20762069248da053830ef3814164e7a93c1c44551a7218"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.624726 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" 
event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerStarted","Data":"010e36b761268009098dac8b6b3acec97266f39fcdde4db68cb4c18464d08624"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.626620 4971 generic.go:334] "Generic (PLEG): container finished" podID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerID="db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846" exitCode=0 Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.627386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerDied","Data":"db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.627424 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerStarted","Data":"2af0350a4b43f0cb40c6ebba6bf403deb23ccbfab656b35eeff9d9db5d4fb8ab"} Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.652059 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.655243 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6tj7\" (UniqueName: \"kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.655475 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.687099 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.687340 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.691161 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6tj7\" (UniqueName: \"kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7\") pod \"redhat-marketplace-4dfvc\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.699803 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:13 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:13 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:13 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.699868 4971 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.734376 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.908629 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.910329 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"] Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.911192 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.924300 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"] Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.963077 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lslm9\" (UniqueName: \"kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.963162 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " 
pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:13 crc kubenswrapper[4971]: I0309 09:22:13.963225 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.063986 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.064093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lslm9\" (UniqueName: \"kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.064156 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.065087 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " 
pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.065123 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.083399 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lslm9\" (UniqueName: \"kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9\") pod \"redhat-marketplace-s9xnf\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") " pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.187675 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"] Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.234798 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9xnf" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.342956 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"] Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.514130 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"] Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.515234 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.520315 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.527533 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"] Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.654368 4971 generic.go:334] "Generic (PLEG): container finished" podID="86be3faf-7eff-4890-8a02-5c541621b4c3" containerID="09476412d2bc92d19dec5d133f7f1c0a9923b349dc27ed7702480afb4e2bf575" exitCode=0 Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.654431 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86be3faf-7eff-4890-8a02-5c541621b4c3","Type":"ContainerDied","Data":"09476412d2bc92d19dec5d133f7f1c0a9923b349dc27ed7702480afb4e2bf575"} Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.661421 4971 generic.go:334] "Generic (PLEG): container finished" podID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerID="03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555" exitCode=0 Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.661512 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerDied","Data":"03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555"} Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.664935 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" event={"ID":"7ddfae4b-5893-4e15-a983-1adb19c5970e","Type":"ContainerStarted","Data":"29e0f6f01e4e568b808e86167b263dfc4bac34d6888c852ac22672675caf489e"} Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.665219 4971 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.671084 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.671142 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2k26\" (UniqueName: \"kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.671193 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.699870 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:14 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:14 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:14 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.699953 4971 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.725877 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" podStartSLOduration=74.72586183 podStartE2EDuration="1m14.72586183s" podCreationTimestamp="2026-03-09 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:14.724947717 +0000 UTC m=+138.284875547" watchObservedRunningTime="2026-03-09 09:22:14.72586183 +0000 UTC m=+138.285789650" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.773054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.773110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2k26\" (UniqueName: \"kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.773168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: 
I0309 09:22:14.773592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.773871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.794255 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2k26\" (UniqueName: \"kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26\") pod \"redhat-operators-k6x5k\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.819849 4971 ???:1] "http: TLS handshake error from 192.168.126.11:43150: no serving certificate available for the kubelet" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.835459 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.914138 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"] Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.915803 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:14 crc kubenswrapper[4971]: I0309 09:22:14.927795 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"] Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.076694 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.076745 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.076764 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gtx\" (UniqueName: \"kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.178227 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.178301 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.178334 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gtx\" (UniqueName: \"kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.178907 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.179110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.215810 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gtx\" (UniqueName: \"kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx\") pod \"redhat-operators-7sctl\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.223959 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 
09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.240071 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.696506 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:15 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:15 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:15 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:15 crc kubenswrapper[4971]: I0309 09:22:15.696578 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:16 crc kubenswrapper[4971]: I0309 09:22:16.645687 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:22:16 crc kubenswrapper[4971]: I0309 09:22:16.650748 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rqlbq" Mar 09 09:22:16 crc kubenswrapper[4971]: I0309 09:22:16.730525 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:16 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:16 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:16 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:16 crc kubenswrapper[4971]: I0309 09:22:16.730597 4971 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:16 crc kubenswrapper[4971]: I0309 09:22:16.800560 4971 ???:1] "http: TLS handshake error from 192.168.126.11:40464: no serving certificate available for the kubelet" Mar 09 09:22:17 crc kubenswrapper[4971]: I0309 09:22:17.696887 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:17 crc kubenswrapper[4971]: [-]has-synced failed: reason withheld Mar 09 09:22:17 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:17 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:17 crc kubenswrapper[4971]: I0309 09:22:17.696962 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:22:18 crc kubenswrapper[4971]: I0309 09:22:18.696011 4971 patch_prober.go:28] interesting pod/router-default-5444994796-w4c8h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:22:18 crc kubenswrapper[4971]: [+]has-synced ok Mar 09 09:22:18 crc kubenswrapper[4971]: [+]process-running ok Mar 09 09:22:18 crc kubenswrapper[4971]: healthz check failed Mar 09 09:22:18 crc kubenswrapper[4971]: I0309 09:22:18.697164 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w4c8h" podUID="9e0270a9-8b08-4abf-88da-75319c5e6f48" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 09 09:22:19 crc kubenswrapper[4971]: W0309 09:22:19.225425 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb8b120_bccf_4c59_9c72_83c6169e3411.slice/crio-742562f6700e65f8d7ab8ba92800039b08e1ad773349376590e6b20dbf5dc557 WatchSource:0}: Error finding container 742562f6700e65f8d7ab8ba92800039b08e1ad773349376590e6b20dbf5dc557: Status 404 returned error can't find the container with id 742562f6700e65f8d7ab8ba92800039b08e1ad773349376590e6b20dbf5dc557 Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.270746 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.277363 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.312971 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"] Mar 09 09:22:19 crc kubenswrapper[4971]: E0309 09:22:19.313233 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.313249 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" Mar 09 09:22:19 crc kubenswrapper[4971]: E0309 09:22:19.313276 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86be3faf-7eff-4890-8a02-5c541621b4c3" containerName="pruner" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.313284 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="86be3faf-7eff-4890-8a02-5c541621b4c3" 
containerName="pruner" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.313426 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="86be3faf-7eff-4890-8a02-5c541621b4c3" containerName="pruner" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.313443 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" containerName="route-controller-manager" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.313879 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.336529 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"] Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352305 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir\") pod \"86be3faf-7eff-4890-8a02-5c541621b4c3\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352394 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9xf\" (UniqueName: \"kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf\") pod \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352445 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") pod \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352502 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") pod \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352562 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") pod \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\" (UID: \"70b1c95e-1326-4a4d-92f8-12df76f6a23a\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352584 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access\") pod \"86be3faf-7eff-4890-8a02-5c541621b4c3\" (UID: \"86be3faf-7eff-4890-8a02-5c541621b4c3\") " Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.352850 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "86be3faf-7eff-4890-8a02-5c541621b4c3" (UID: "86be3faf-7eff-4890-8a02-5c541621b4c3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.353605 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca" (OuterVolumeSpecName: "client-ca") pod "70b1c95e-1326-4a4d-92f8-12df76f6a23a" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.353618 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config" (OuterVolumeSpecName: "config") pod "70b1c95e-1326-4a4d-92f8-12df76f6a23a" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.374637 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "86be3faf-7eff-4890-8a02-5c541621b4c3" (UID: "86be3faf-7eff-4890-8a02-5c541621b4c3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.375094 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "70b1c95e-1326-4a4d-92f8-12df76f6a23a" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.375564 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf" (OuterVolumeSpecName: "kube-api-access-pc9xf") pod "70b1c95e-1326-4a4d-92f8-12df76f6a23a" (UID: "70b1c95e-1326-4a4d-92f8-12df76f6a23a"). InnerVolumeSpecName "kube-api-access-pc9xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453497 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453565 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gq5\" (UniqueName: \"kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453609 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453649 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453696 4971 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453712 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86be3faf-7eff-4890-8a02-5c541621b4c3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453723 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86be3faf-7eff-4890-8a02-5c541621b4c3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453732 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9xf\" (UniqueName: \"kubernetes.io/projected/70b1c95e-1326-4a4d-92f8-12df76f6a23a-kube-api-access-pc9xf\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453741 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70b1c95e-1326-4a4d-92f8-12df76f6a23a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.453750 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70b1c95e-1326-4a4d-92f8-12df76f6a23a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.555119 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.555183 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z4gq5\" (UniqueName: \"kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.555249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.555302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.556599 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.557283 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " 
pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.560717 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.576260 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gq5\" (UniqueName: \"kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5\") pod \"route-controller-manager-67f87dff78-6zx46\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") " pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.642261 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.699526 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerStarted","Data":"742562f6700e65f8d7ab8ba92800039b08e1ad773349376590e6b20dbf5dc557"} Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.700730 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.701468 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.701726 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl" event={"ID":"70b1c95e-1326-4a4d-92f8-12df76f6a23a","Type":"ContainerDied","Data":"fb6702515f9badf816344febadc98388380014042bbb6946613c5d053ab4e320"} Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.701757 4971 scope.go:117] "RemoveContainer" containerID="bb3a286d82cee965ba9ca19b7be6268ae3e147bee10a83aa90e858730c3371f6" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.703831 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86be3faf-7eff-4890-8a02-5c541621b4c3","Type":"ContainerDied","Data":"3865aac07391fe61f4debd2209ec825d63445a6a367315ffa8a02cf254eb3a80"} Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.703873 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3865aac07391fe61f4debd2209ec825d63445a6a367315ffa8a02cf254eb3a80" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.703885 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.704526 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w4c8h" Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.782119 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 09:22:19 crc kubenswrapper[4971]: I0309 09:22:19.789853 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-t9sxl"] Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.153469 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:22:20 crc kubenswrapper[4971]: E0309 09:22:20.153753 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.803043 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.803101 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: 
connect: connection refused" Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.803146 4971 patch_prober.go:28] interesting pod/downloads-7954f5f757-d25sv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.803219 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d25sv" podUID="afc88ae6-e5b1-4da0-b10a-a6bf1816e6fa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.815079 4971 patch_prober.go:28] interesting pod/console-f9d7485db-dnx9z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 09 09:22:20 crc kubenswrapper[4971]: I0309 09:22:20.815135 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dnx9z" podUID="c8c3ac1c-4896-4db2-8917-0a57667a1fa8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 09 09:22:21 crc kubenswrapper[4971]: I0309 09:22:21.158915 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b1c95e-1326-4a4d-92f8-12df76f6a23a" path="/var/lib/kubelet/pods/70b1c95e-1326-4a4d-92f8-12df76f6a23a/volumes" Mar 09 09:22:21 crc kubenswrapper[4971]: E0309 09:22:21.312947 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:21 crc kubenswrapper[4971]: E0309 09:22:21.314908 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:21 crc kubenswrapper[4971]: E0309 09:22:21.316488 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:21 crc kubenswrapper[4971]: E0309 09:22:21.316527 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:21 crc kubenswrapper[4971]: W0309 09:22:21.663071 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ff3ca9_0eab_42b0_8b08_3bef726fd70f.slice/crio-eca1d7abab8a5286901220c988859636cc2ef54fd3f2443374de5d043d5d7982 WatchSource:0}: Error finding container eca1d7abab8a5286901220c988859636cc2ef54fd3f2443374de5d043d5d7982: Status 404 returned error can't find the container with id eca1d7abab8a5286901220c988859636cc2ef54fd3f2443374de5d043d5d7982 Mar 09 09:22:21 crc kubenswrapper[4971]: I0309 09:22:21.714799 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" 
event={"ID":"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f","Type":"ContainerStarted","Data":"eca1d7abab8a5286901220c988859636cc2ef54fd3f2443374de5d043d5d7982"} Mar 09 09:22:23 crc kubenswrapper[4971]: I0309 09:22:23.977351 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"] Mar 09 09:22:25 crc kubenswrapper[4971]: I0309 09:22:25.083595 4971 ???:1] "http: TLS handshake error from 192.168.126.11:40480: no serving certificate available for the kubelet" Mar 09 09:22:27 crc kubenswrapper[4971]: W0309 09:22:27.742437 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf03c17cc_a83d_4187_99d2_2c91b6edb26c.slice/crio-9849c714ed13c7cf7d1113e1721f50bfaf2d114e7f2b4f84c3682c9dff85dce4 WatchSource:0}: Error finding container 9849c714ed13c7cf7d1113e1721f50bfaf2d114e7f2b4f84c3682c9dff85dce4: Status 404 returned error can't find the container with id 9849c714ed13c7cf7d1113e1721f50bfaf2d114e7f2b4f84c3682c9dff85dce4 Mar 09 09:22:27 crc kubenswrapper[4971]: I0309 09:22:27.755397 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerStarted","Data":"9849c714ed13c7cf7d1113e1721f50bfaf2d114e7f2b4f84c3682c9dff85dce4"} Mar 09 09:22:29 crc kubenswrapper[4971]: I0309 09:22:29.574833 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:22:30 crc kubenswrapper[4971]: I0309 09:22:30.819091 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:22:30 crc kubenswrapper[4971]: I0309 09:22:30.822410 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d25sv" Mar 09 09:22:30 crc kubenswrapper[4971]: I0309 09:22:30.831016 
4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dnx9z" Mar 09 09:22:31 crc kubenswrapper[4971]: E0309 09:22:31.313816 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:31 crc kubenswrapper[4971]: E0309 09:22:31.315837 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:31 crc kubenswrapper[4971]: E0309 09:22:31.317284 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:31 crc kubenswrapper[4971]: E0309 09:22:31.317313 4971 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:31 crc kubenswrapper[4971]: I0309 09:22:31.865233 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"] Mar 09 09:22:31 crc kubenswrapper[4971]: I0309 09:22:31.904354 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"] Mar 09 09:22:32 crc kubenswrapper[4971]: I0309 09:22:32.518453 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" Mar 09 09:22:34 crc kubenswrapper[4971]: I0309 09:22:34.151995 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083" Mar 09 09:22:34 crc kubenswrapper[4971]: E0309 09:22:34.152186 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:22:34 crc kubenswrapper[4971]: I0309 09:22:34.169499 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 09:22:34 crc kubenswrapper[4971]: I0309 09:22:34.172122 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 09:22:35 crc kubenswrapper[4971]: E0309 09:22:35.686604 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 09:22:35 crc kubenswrapper[4971]: E0309 09:22:35.686784 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hf854,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ft9v2_openshift-marketplace(1054c243-8a85-4262-ba12-2ee5643d0255): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:22:35 crc kubenswrapper[4971]: E0309 09:22:35.687868 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ft9v2" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" Mar 09 09:22:35 crc 
kubenswrapper[4971]: I0309 09:22:35.851939 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.851888677 podStartE2EDuration="1.851888677s" podCreationTimestamp="2026-03-09 09:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:35.83439996 +0000 UTC m=+159.394327780" watchObservedRunningTime="2026-03-09 09:22:35.851888677 +0000 UTC m=+159.411816487" Mar 09 09:22:35 crc kubenswrapper[4971]: I0309 09:22:35.869788 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.869768908 podStartE2EDuration="1.869768908s" podCreationTimestamp="2026-03-09 09:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:35.850664343 +0000 UTC m=+159.410592173" watchObservedRunningTime="2026-03-09 09:22:35.869768908 +0000 UTC m=+159.429696718" Mar 09 09:22:36 crc kubenswrapper[4971]: I0309 09:22:36.804718 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-n8lbv_1a0999c2-4d90-4197-8075-e11790a0ed9b/kube-multus-additional-cni-plugins/0.log" Mar 09 09:22:36 crc kubenswrapper[4971]: I0309 09:22:36.805094 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" exitCode=137 Mar 09 09:22:36 crc kubenswrapper[4971]: I0309 09:22:36.805130 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" event={"ID":"1a0999c2-4d90-4197-8075-e11790a0ed9b","Type":"ContainerDied","Data":"61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68"} Mar 09 09:22:41 crc kubenswrapper[4971]: I0309 
09:22:41.236056 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qz4l7" Mar 09 09:22:41 crc kubenswrapper[4971]: E0309 09:22:41.311741 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68 is running failed: container process not found" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:41 crc kubenswrapper[4971]: E0309 09:22:41.312415 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68 is running failed: container process not found" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:41 crc kubenswrapper[4971]: E0309 09:22:41.312717 4971 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68 is running failed: container process not found" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:22:41 crc kubenswrapper[4971]: E0309 09:22:41.312754 4971 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" 
containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.256809 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ft9v2" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.289369 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.289708 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fwdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4dhm7_openshift-marketplace(41ca417f-9f99-44da-b444-4ecf1b9b5d04): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.291824 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4dhm7" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" Mar 09 09:22:42 crc 
kubenswrapper[4971]: E0309 09:22:42.291922 4971 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.292088 4971 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srmbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5cqt9_openshift-marketplace(e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.293264 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5cqt9" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.483321 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"] Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.525848 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-n8lbv_1a0999c2-4d90-4197-8075-e11790a0ed9b/kube-multus-additional-cni-plugins/0.log" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.525905 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.533595 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"] Mar 09 09:22:42 crc kubenswrapper[4971]: W0309 09:22:42.552577 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe25e82_76e3_4639_98f8_75a1e7f51c19.slice/crio-f40d96482a75c7f4f80cb47f53babf9da0fb8431f7bfe6f7e27f28f91e255023 WatchSource:0}: Error finding container f40d96482a75c7f4f80cb47f53babf9da0fb8431f7bfe6f7e27f28f91e255023: Status 404 returned error can't find the container with id f40d96482a75c7f4f80cb47f53babf9da0fb8431f7bfe6f7e27f28f91e255023 Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.564765 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"] Mar 09 09:22:42 crc kubenswrapper[4971]: W0309 09:22:42.584373 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73403aa7_c920_436e_a393_8a47d8b64086.slice/crio-faaf0bb6d730390cb385dd3a9d457ade36ca400ea53c81931d0647250cda493c WatchSource:0}: Error finding container faaf0bb6d730390cb385dd3a9d457ade36ca400ea53c81931d0647250cda493c: Status 404 returned error can't find the container with id faaf0bb6d730390cb385dd3a9d457ade36ca400ea53c81931d0647250cda493c Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.609824 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready\") pod \"1a0999c2-4d90-4197-8075-e11790a0ed9b\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.609895 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist\") pod \"1a0999c2-4d90-4197-8075-e11790a0ed9b\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.609955 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir\") pod \"1a0999c2-4d90-4197-8075-e11790a0ed9b\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.610026 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrlsz\" (UniqueName: \"kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz\") pod \"1a0999c2-4d90-4197-8075-e11790a0ed9b\" (UID: \"1a0999c2-4d90-4197-8075-e11790a0ed9b\") " Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.610593 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready" (OuterVolumeSpecName: "ready") pod "1a0999c2-4d90-4197-8075-e11790a0ed9b" (UID: "1a0999c2-4d90-4197-8075-e11790a0ed9b"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.610605 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "1a0999c2-4d90-4197-8075-e11790a0ed9b" (UID: "1a0999c2-4d90-4197-8075-e11790a0ed9b"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.610774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "1a0999c2-4d90-4197-8075-e11790a0ed9b" (UID: "1a0999c2-4d90-4197-8075-e11790a0ed9b"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.616361 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz" (OuterVolumeSpecName: "kube-api-access-lrlsz") pod "1a0999c2-4d90-4197-8075-e11790a0ed9b" (UID: "1a0999c2-4d90-4197-8075-e11790a0ed9b"). InnerVolumeSpecName "kube-api-access-lrlsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.712078 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrlsz\" (UniqueName: \"kubernetes.io/projected/1a0999c2-4d90-4197-8075-e11790a0ed9b-kube-api-access-lrlsz\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.712124 4971 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1a0999c2-4d90-4197-8075-e11790a0ed9b-ready\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.712134 4971 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1a0999c2-4d90-4197-8075-e11790a0ed9b-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.712144 4971 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1a0999c2-4d90-4197-8075-e11790a0ed9b-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.736998 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.737905 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.737948 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.738109 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" containerName="kube-multus-additional-cni-plugins" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.738669 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.741229 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.744676 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.749631 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.813324 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.813394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.838217 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" event={"ID":"73403aa7-c920-436e-a393-8a47d8b64086","Type":"ContainerStarted","Data":"faaf0bb6d730390cb385dd3a9d457ade36ca400ea53c81931d0647250cda493c"} Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.840906 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-n8lbv_1a0999c2-4d90-4197-8075-e11790a0ed9b/kube-multus-additional-cni-plugins/0.log" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.840965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" event={"ID":"1a0999c2-4d90-4197-8075-e11790a0ed9b","Type":"ContainerDied","Data":"0df6c92ebf52c79c1452578b5abd86c7588faa9829a1cca2d963045587eed64a"} Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.840993 4971 scope.go:117] "RemoveContainer" containerID="61c76aff58c35e94f3e4d72f3e326230fd28af3da5913b414ea145eb56170a68" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.841089 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-n8lbv" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.843457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerStarted","Data":"f40d96482a75c7f4f80cb47f53babf9da0fb8431f7bfe6f7e27f28f91e255023"} Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.844881 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerStarted","Data":"5c6b28d8e3045025f13a68b6eeca8267de1e2bd0735b5f11870b01ed8c16c64e"} Mar 09 09:22:42 crc kubenswrapper[4971]: E0309 09:22:42.846552 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5cqt9" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.907218 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-multus/cni-sysctl-allowlist-ds-n8lbv"] Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.909931 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-n8lbv"] Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.918057 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.918149 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.919084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:42 crc kubenswrapper[4971]: I0309 09:22:42.935508 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:43 crc kubenswrapper[4971]: I0309 09:22:43.065989 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:43 crc kubenswrapper[4971]: I0309 09:22:43.158107 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0999c2-4d90-4197-8075-e11790a0ed9b" path="/var/lib/kubelet/pods/1a0999c2-4d90-4197-8075-e11790a0ed9b/volumes" Mar 09 09:22:44 crc kubenswrapper[4971]: I0309 09:22:44.855815 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerStarted","Data":"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759"} Mar 09 09:22:44 crc kubenswrapper[4971]: I0309 09:22:44.858693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" event={"ID":"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f","Type":"ContainerStarted","Data":"3d33a47916db8d980cdf85e4dc283a379886f64bac8ba907ec558d52d023ed44"} Mar 09 09:22:44 crc kubenswrapper[4971]: I0309 09:22:44.860285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerStarted","Data":"1773a68a876713072ab28e2de5f9975c302bece56a93e2c6be2d9fb0e5ae13a0"} Mar 09 09:22:44 crc kubenswrapper[4971]: I0309 09:22:44.861776 4971 generic.go:334] "Generic (PLEG): container finished" podID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerID="f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476" exitCode=0 Mar 09 09:22:44 crc kubenswrapper[4971]: I0309 09:22:44.861824 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerDied","Data":"f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476"} Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.364794 4971 csr.go:261] certificate signing request 
csr-xz8tm is approved, waiting to be issued Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.372677 4971 csr.go:257] certificate signing request csr-xz8tm is issued Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.869903 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" event={"ID":"73403aa7-c920-436e-a393-8a47d8b64086","Type":"ContainerStarted","Data":"b5fd8d1335898c9cd3437c81582e7430d9a111bcc11b63a4f8c9f8f07c3573e0"} Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.870082 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" podUID="73403aa7-c920-436e-a393-8a47d8b64086" containerName="route-controller-manager" containerID="cri-o://b5fd8d1335898c9cd3437c81582e7430d9a111bcc11b63a4f8c9f8f07c3573e0" gracePeriod=30 Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.870328 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.871508 4971 generic.go:334] "Generic (PLEG): container finished" podID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerID="a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f" exitCode=0 Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.871572 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerDied","Data":"a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f"} Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.873118 4971 generic.go:334] "Generic (PLEG): container finished" podID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerID="51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759" exitCode=0 Mar 09 09:22:45 crc 
kubenswrapper[4971]: I0309 09:22:45.873188 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerDied","Data":"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759"}
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.875481 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.875787 4971 generic.go:334] "Generic (PLEG): container finished" podID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerID="1773a68a876713072ab28e2de5f9975c302bece56a93e2c6be2d9fb0e5ae13a0" exitCode=0
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.875824 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerDied","Data":"1773a68a876713072ab28e2de5f9975c302bece56a93e2c6be2d9fb0e5ae13a0"}
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.880215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" event={"ID":"603b9f27-06c0-4fe8-8cc3-416122462369","Type":"ContainerStarted","Data":"5f39f2443e019d56e40c2f46b6f04fe595e46e9d8c4aed977f878cae5f6bd534"}
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.880381 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerName="controller-manager" containerID="cri-o://3d33a47916db8d980cdf85e4dc283a379886f64bac8ba907ec558d52d023ed44" gracePeriod=30
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.880638 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw"
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.907898 4971 patch_prober.go:28] interesting pod/controller-manager-6b969cd678-zr6fw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": read tcp 10.217.0.2:54096->10.217.0.51:8443: read: connection reset by peer" start-of-body=
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.907954 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": read tcp 10.217.0.2:54096->10.217.0.51:8443: read: connection reset by peer"
Mar 09 09:22:45 crc kubenswrapper[4971]: I0309 09:22:45.908159 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" podStartSLOduration=33.908142278 podStartE2EDuration="33.908142278s" podCreationTimestamp="2026-03-09 09:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:45.907680602 +0000 UTC m=+169.467608412" watchObservedRunningTime="2026-03-09 09:22:45.908142278 +0000 UTC m=+169.468070078"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.034601 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" podStartSLOduration=7.694556364 podStartE2EDuration="46.03457992s" podCreationTimestamp="2026-03-09 09:22:00 +0000 UTC" firstStartedPulling="2026-03-09 09:22:03.94938473 +0000 UTC m=+127.509312540" lastFinishedPulling="2026-03-09 09:22:42.289408276 +0000 UTC m=+165.849336096" observedRunningTime="2026-03-09 09:22:46.013696441 +0000 UTC m=+169.573624251" watchObservedRunningTime="2026-03-09 09:22:46.03457992 +0000 UTC m=+169.594507730"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.036385 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" podStartSLOduration=34.036377154 podStartE2EDuration="34.036377154s" podCreationTimestamp="2026-03-09 09:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:46.029619472 +0000 UTC m=+169.589547282" watchObservedRunningTime="2026-03-09 09:22:46.036377154 +0000 UTC m=+169.596304964"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.374082 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 01:47:46.776421254 +0000 UTC
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.374127 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6352h25m0.402296891s for next certificate rotation
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.825421 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.826861 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.833284 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.891844 4971 generic.go:334] "Generic (PLEG): container finished" podID="603b9f27-06c0-4fe8-8cc3-416122462369" containerID="5f39f2443e019d56e40c2f46b6f04fe595e46e9d8c4aed977f878cae5f6bd534" exitCode=0
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.891900 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" event={"ID":"603b9f27-06c0-4fe8-8cc3-416122462369","Type":"ContainerDied","Data":"5f39f2443e019d56e40c2f46b6f04fe595e46e9d8c4aed977f878cae5f6bd534"}
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.895519 4971 generic.go:334] "Generic (PLEG): container finished" podID="73403aa7-c920-436e-a393-8a47d8b64086" containerID="b5fd8d1335898c9cd3437c81582e7430d9a111bcc11b63a4f8c9f8f07c3573e0" exitCode=0
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.895594 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" event={"ID":"73403aa7-c920-436e-a393-8a47d8b64086","Type":"ContainerDied","Data":"b5fd8d1335898c9cd3437c81582e7430d9a111bcc11b63a4f8c9f8f07c3573e0"}
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.897075 4971 generic.go:334] "Generic (PLEG): container finished" podID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerID="3d33a47916db8d980cdf85e4dc283a379886f64bac8ba907ec558d52d023ed44" exitCode=0
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.897105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" event={"ID":"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f","Type":"ContainerDied","Data":"3d33a47916db8d980cdf85e4dc283a379886f64bac8ba907ec558d52d023ed44"}
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.912805 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.912889 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.912957 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.929334 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.974146 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.976383 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"]
Mar 09 09:22:46 crc kubenswrapper[4971]: E0309 09:22:46.976592 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerName="controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.976610 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerName="controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: E0309 09:22:46.976630 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73403aa7-c920-436e-a393-8a47d8b64086" containerName="route-controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.976639 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="73403aa7-c920-436e-a393-8a47d8b64086" containerName="route-controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.976760 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="73403aa7-c920-436e-a393-8a47d8b64086" containerName="route-controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.976776 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" containerName="controller-manager"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.977206 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:46 crc kubenswrapper[4971]: I0309 09:22:46.979060 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"]
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.017038 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gq5\" (UniqueName: \"kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5\") pod \"73403aa7-c920-436e-a393-8a47d8b64086\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.017383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config\") pod \"73403aa7-c920-436e-a393-8a47d8b64086\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.017518 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca\") pod \"73403aa7-c920-436e-a393-8a47d8b64086\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.017713 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert\") pod \"73403aa7-c920-436e-a393-8a47d8b64086\" (UID: \"73403aa7-c920-436e-a393-8a47d8b64086\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.017998 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.018129 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.018716 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.018897 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.019009 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.018173 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca" (OuterVolumeSpecName: "client-ca") pod "73403aa7-c920-436e-a393-8a47d8b64086" (UID: "73403aa7-c920-436e-a393-8a47d8b64086"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.018657 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config" (OuterVolumeSpecName: "config") pod "73403aa7-c920-436e-a393-8a47d8b64086" (UID: "73403aa7-c920-436e-a393-8a47d8b64086"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.024013 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "73403aa7-c920-436e-a393-8a47d8b64086" (UID: "73403aa7-c920-436e-a393-8a47d8b64086"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.024414 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5" (OuterVolumeSpecName: "kube-api-access-z4gq5") pod "73403aa7-c920-436e-a393-8a47d8b64086" (UID: "73403aa7-c920-436e-a393-8a47d8b64086"). InnerVolumeSpecName "kube-api-access-z4gq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.038170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access\") pod \"installer-9-crc\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.119835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config\") pod \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.119978 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles\") pod \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120037 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrfd8\" (UniqueName: \"kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8\") pod \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120059 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert\") pod \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120129 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca\") pod \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\" (UID: \"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f\") "
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120294 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120339 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcgd\" (UniqueName: \"kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120420 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120480 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120550 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120567 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/73403aa7-c920-436e-a393-8a47d8b64086-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120580 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73403aa7-c920-436e-a393-8a47d8b64086-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.120592 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4gq5\" (UniqueName: \"kubernetes.io/projected/73403aa7-c920-436e-a393-8a47d8b64086-kube-api-access-z4gq5\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.121910 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config" (OuterVolumeSpecName: "config") pod "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" (UID: "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.122022 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" (UID: "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.122464 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca" (OuterVolumeSpecName: "client-ca") pod "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" (UID: "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.123904 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8" (OuterVolumeSpecName: "kube-api-access-qrfd8") pod "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" (UID: "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f"). InnerVolumeSpecName "kube-api-access-qrfd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.126596 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" (UID: "a0ff3ca9-0eab-42b0-8b08-3bef726fd70f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.129967 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 09:22:47 crc kubenswrapper[4971]: W0309 09:22:47.143246 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbf8511f8_2505_4b62_9b52_1061935e2517.slice/crio-d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656 WatchSource:0}: Error finding container d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656: Status 404 returned error can't find the container with id d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.149791 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.222083 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.222704 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcgd\" (UniqueName: \"kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.222840 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.222988 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.223275 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.223397 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.223473 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.224540 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrfd8\" (UniqueName: \"kubernetes.io/projected/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-kube-api-access-qrfd8\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.224636 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.224887 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.238424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.245500 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcgd\" (UniqueName: \"kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.245603 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config\") pod \"route-controller-manager-7f6bbcfdcb-mlcdr\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.299181 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.375184 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 04:11:47.355088407 +0000 UTC
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.375251 4971 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6834h48m59.979839393s for next certificate rotation
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.388517 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 09 09:22:47 crc kubenswrapper[4971]: W0309 09:22:47.401412 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podac5d26f5_5e17_4dd7_a334_5060a68b2d08.slice/crio-7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc WatchSource:0}: Error finding container 7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc: Status 404 returned error can't find the container with id 7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.744873 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"]
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.913901 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" event={"ID":"85ec2882-f893-4f7f-a4c2-aa70aedcabf2","Type":"ContainerStarted","Data":"3cb5b26f01fce584e56a8b8887ec536df131e3e3b492e10dfd8ad0f28a54cd6d"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.918225 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw" event={"ID":"a0ff3ca9-0eab-42b0-8b08-3bef726fd70f","Type":"ContainerDied","Data":"eca1d7abab8a5286901220c988859636cc2ef54fd3f2443374de5d043d5d7982"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.918257 4971 scope.go:117] "RemoveContainer" containerID="3d33a47916db8d980cdf85e4dc283a379886f64bac8ba907ec558d52d023ed44"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.918386 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b969cd678-zr6fw"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.924335 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ac5d26f5-5e17-4dd7-a334-5060a68b2d08","Type":"ContainerStarted","Data":"732cb3cce9822a22f43f2856399c978d0c7464fbbfff61e340a55dd7c8effa19"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.924392 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ac5d26f5-5e17-4dd7-a334-5060a68b2d08","Type":"ContainerStarted","Data":"7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.931079 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf8511f8-2505-4b62-9b52-1061935e2517","Type":"ContainerStarted","Data":"912175aa1cd3a2876fbb927f23779da1c593b79de746487494500b8b40aa97c0"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.931143 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf8511f8-2505-4b62-9b52-1061935e2517","Type":"ContainerStarted","Data":"d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.936844 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"]
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.937986 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46" event={"ID":"73403aa7-c920-436e-a393-8a47d8b64086","Type":"ContainerDied","Data":"faaf0bb6d730390cb385dd3a9d457ade36ca400ea53c81931d0647250cda493c"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.938005 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.939330 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b969cd678-zr6fw"]
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.940215 4971 generic.go:334] "Generic (PLEG): container finished" podID="b1992d44-6e31-4432-88f0-320408d9fa70" containerID="cfe701473a62562e73a494c9c76c3af4506f0da33202cb97a0e531d1d81f26ed" exitCode=0
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.940263 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerDied","Data":"cfe701473a62562e73a494c9c76c3af4506f0da33202cb97a0e531d1d81f26ed"}
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.947309 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=5.947297342 podStartE2EDuration="5.947297342s" podCreationTimestamp="2026-03-09 09:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:47.946603217 +0000 UTC m=+171.506531037" watchObservedRunningTime="2026-03-09 09:22:47.947297342 +0000 UTC m=+171.507225152"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.956636 4971 scope.go:117] "RemoveContainer" containerID="b5fd8d1335898c9cd3437c81582e7430d9a111bcc11b63a4f8c9f8f07c3573e0"
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.986138 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"]
Mar 09 09:22:47 crc kubenswrapper[4971]: I0309 09:22:47.990565 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67f87dff78-6zx46"]
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.152499 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"
Mar 09 09:22:48 crc kubenswrapper[4971]: E0309 09:22:48.152823 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.253097 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-d8cbz"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.340700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlrhf\" (UniqueName: \"kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf\") pod \"603b9f27-06c0-4fe8-8cc3-416122462369\" (UID: \"603b9f27-06c0-4fe8-8cc3-416122462369\") "
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.346856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf" (OuterVolumeSpecName: "kube-api-access-tlrhf") pod "603b9f27-06c0-4fe8-8cc3-416122462369" (UID: "603b9f27-06c0-4fe8-8cc3-416122462369"). InnerVolumeSpecName "kube-api-access-tlrhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.443694 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlrhf\" (UniqueName: \"kubernetes.io/projected/603b9f27-06c0-4fe8-8cc3-416122462369-kube-api-access-tlrhf\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.951285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" event={"ID":"85ec2882-f893-4f7f-a4c2-aa70aedcabf2","Type":"ContainerStarted","Data":"ecc91af12b1d478ad620fd22b4df5c9297f085e244616960899de7669be6cd8f"}
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.951677 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.954509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-d8cbz" event={"ID":"603b9f27-06c0-4fe8-8cc3-416122462369","Type":"ContainerDied","Data":"d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9"}
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.954545 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4ecad667b6fe558d6eb04e691231e454f199ca1b59dce91c529c6d8a87126d9"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.954549 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-d8cbz"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.957425 4971 generic.go:334] "Generic (PLEG): container finished" podID="bf8511f8-2505-4b62-9b52-1061935e2517" containerID="912175aa1cd3a2876fbb927f23779da1c593b79de746487494500b8b40aa97c0" exitCode=0
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.957467 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf8511f8-2505-4b62-9b52-1061935e2517","Type":"ContainerDied","Data":"912175aa1cd3a2876fbb927f23779da1c593b79de746487494500b8b40aa97c0"}
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.959692 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.963524 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerStarted","Data":"5690cec9250d82cb806a9759f34b8e0fcc6f90509029e2540c3624744749af74"}
Mar 09 09:22:48 crc kubenswrapper[4971]: I0309 09:22:48.978186 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" podStartSLOduration=17.978161729 podStartE2EDuration="17.978161729s"
podCreationTimestamp="2026-03-09 09:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:48.969962515 +0000 UTC m=+172.529890345" watchObservedRunningTime="2026-03-09 09:22:48.978161729 +0000 UTC m=+172.538089539" Mar 09 09:22:49 crc kubenswrapper[4971]: I0309 09:22:49.038673 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xnbvj" podStartSLOduration=3.192013651 podStartE2EDuration="38.038652428s" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="2026-03-09 09:22:13.624260107 +0000 UTC m=+137.184187917" lastFinishedPulling="2026-03-09 09:22:48.470898884 +0000 UTC m=+172.030826694" observedRunningTime="2026-03-09 09:22:49.035604529 +0000 UTC m=+172.595532329" watchObservedRunningTime="2026-03-09 09:22:49.038652428 +0000 UTC m=+172.598580228" Mar 09 09:22:49 crc kubenswrapper[4971]: I0309 09:22:49.061931 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.061910202 podStartE2EDuration="3.061910202s" podCreationTimestamp="2026-03-09 09:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:49.056218948 +0000 UTC m=+172.616146758" watchObservedRunningTime="2026-03-09 09:22:49.061910202 +0000 UTC m=+172.621838012" Mar 09 09:22:49 crc kubenswrapper[4971]: I0309 09:22:49.161923 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73403aa7-c920-436e-a393-8a47d8b64086" path="/var/lib/kubelet/pods/73403aa7-c920-436e-a393-8a47d8b64086/volumes" Mar 09 09:22:49 crc kubenswrapper[4971]: I0309 09:22:49.162611 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ff3ca9-0eab-42b0-8b08-3bef726fd70f" 
path="/var/lib/kubelet/pods/a0ff3ca9-0eab-42b0-8b08-3bef726fd70f/volumes" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.241237 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.375184 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir\") pod \"bf8511f8-2505-4b62-9b52-1061935e2517\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.375294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access\") pod \"bf8511f8-2505-4b62-9b52-1061935e2517\" (UID: \"bf8511f8-2505-4b62-9b52-1061935e2517\") " Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.375312 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bf8511f8-2505-4b62-9b52-1061935e2517" (UID: "bf8511f8-2505-4b62-9b52-1061935e2517"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.375566 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf8511f8-2505-4b62-9b52-1061935e2517-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.382524 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bf8511f8-2505-4b62-9b52-1061935e2517" (UID: "bf8511f8-2505-4b62-9b52-1061935e2517"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.477577 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf8511f8-2505-4b62-9b52-1061935e2517-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.995050 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bf8511f8-2505-4b62-9b52-1061935e2517","Type":"ContainerDied","Data":"d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656"} Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.995108 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d99141e3cec696389bd9b1b40c16fa5eb4ef927abfdad5687f1b659650d656" Mar 09 09:22:50 crc kubenswrapper[4971]: I0309 09:22:50.995077 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.449418 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:51 crc kubenswrapper[4971]: E0309 09:22:51.449683 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603b9f27-06c0-4fe8-8cc3-416122462369" containerName="oc" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.449698 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="603b9f27-06c0-4fe8-8cc3-416122462369" containerName="oc" Mar 09 09:22:51 crc kubenswrapper[4971]: E0309 09:22:51.449724 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8511f8-2505-4b62-9b52-1061935e2517" containerName="pruner" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.449732 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8511f8-2505-4b62-9b52-1061935e2517" containerName="pruner" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.449879 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8511f8-2505-4b62-9b52-1061935e2517" containerName="pruner" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.449895 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="603b9f27-06c0-4fe8-8cc3-416122462369" containerName="oc" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.450382 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.453124 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.455827 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.456304 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.456545 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.456760 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.459161 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.459378 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.460787 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.593935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " 
pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.593999 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hb9t\" (UniqueName: \"kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.594042 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.594083 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.594100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.694741 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.694785 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.694821 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.694849 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hb9t\" (UniqueName: \"kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.694885 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.696342 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.698964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.699080 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.707681 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.715864 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hb9t\" (UniqueName: \"kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t\") pod \"controller-manager-5d596d879-p8h8h\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc 
kubenswrapper[4971]: I0309 09:22:51.770151 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.846201 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.856726 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.860408 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:51 crc kubenswrapper[4971]: I0309 09:22:51.971674 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"] Mar 09 09:22:52 crc kubenswrapper[4971]: I0309 09:22:52.003154 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" podUID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" containerName="route-controller-manager" containerID="cri-o://ecc91af12b1d478ad620fd22b4df5c9297f085e244616960899de7669be6cd8f" gracePeriod=30 Mar 09 09:22:52 crc kubenswrapper[4971]: I0309 09:22:52.064309 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:53 crc kubenswrapper[4971]: I0309 09:22:53.010786 4971 generic.go:334] "Generic (PLEG): container finished" podID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" containerID="ecc91af12b1d478ad620fd22b4df5c9297f085e244616960899de7669be6cd8f" exitCode=0 Mar 09 09:22:53 crc kubenswrapper[4971]: I0309 09:22:53.010872 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" event={"ID":"85ec2882-f893-4f7f-a4c2-aa70aedcabf2","Type":"ContainerDied","Data":"ecc91af12b1d478ad620fd22b4df5c9297f085e244616960899de7669be6cd8f"} Mar 09 09:22:53 crc kubenswrapper[4971]: I0309 09:22:53.048827 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:53 crc kubenswrapper[4971]: I0309 09:22:53.091475 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:55 crc kubenswrapper[4971]: I0309 09:22:55.021808 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xnbvj" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="registry-server" containerID="cri-o://5690cec9250d82cb806a9759f34b8e0fcc6f90509029e2540c3624744749af74" gracePeriod=2 Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.029588 4971 generic.go:334] "Generic (PLEG): container finished" podID="b1992d44-6e31-4432-88f0-320408d9fa70" containerID="5690cec9250d82cb806a9759f34b8e0fcc6f90509029e2540c3624744749af74" exitCode=0 Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.029759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerDied","Data":"5690cec9250d82cb806a9759f34b8e0fcc6f90509029e2540c3624744749af74"} Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.310629 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.337818 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"] Mar 09 09:22:56 crc kubenswrapper[4971]: E0309 09:22:56.338124 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" containerName="route-controller-manager" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.338141 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" containerName="route-controller-manager" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.338286 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" containerName="route-controller-manager" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.338802 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.349064 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"] Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.355594 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert\") pod \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.355666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config\") pod \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.355778 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkcgd\" (UniqueName: \"kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd\") pod \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.355816 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca\") pod \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\" (UID: \"85ec2882-f893-4f7f-a4c2-aa70aedcabf2\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.357473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config" (OuterVolumeSpecName: "config") pod "85ec2882-f893-4f7f-a4c2-aa70aedcabf2" (UID: 
"85ec2882-f893-4f7f-a4c2-aa70aedcabf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.357811 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca" (OuterVolumeSpecName: "client-ca") pod "85ec2882-f893-4f7f-a4c2-aa70aedcabf2" (UID: "85ec2882-f893-4f7f-a4c2-aa70aedcabf2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.364818 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "85ec2882-f893-4f7f-a4c2-aa70aedcabf2" (UID: "85ec2882-f893-4f7f-a4c2-aa70aedcabf2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.364873 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd" (OuterVolumeSpecName: "kube-api-access-xkcgd") pod "85ec2882-f893-4f7f-a4c2-aa70aedcabf2" (UID: "85ec2882-f893-4f7f-a4c2-aa70aedcabf2"). InnerVolumeSpecName "kube-api-access-xkcgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2w47\" (UniqueName: \"kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459822 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459847 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459864 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459925 4971 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xkcgd\" (UniqueName: \"kubernetes.io/projected/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-kube-api-access-xkcgd\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459936 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459945 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.459955 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85ec2882-f893-4f7f-a4c2-aa70aedcabf2-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.561951 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2w47\" (UniqueName: \"kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.561996 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.562022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.562039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.562932 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.563171 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.570936 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 
09:22:56.580662 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.581068 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2w47\" (UniqueName: \"kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47\") pod \"route-controller-manager-678c6bcfb7-wtdzw\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") " pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.663002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities\") pod \"b1992d44-6e31-4432-88f0-320408d9fa70\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.663058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9cl4\" (UniqueName: \"kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4\") pod \"b1992d44-6e31-4432-88f0-320408d9fa70\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.663121 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content\") pod \"b1992d44-6e31-4432-88f0-320408d9fa70\" (UID: \"b1992d44-6e31-4432-88f0-320408d9fa70\") " Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.666218 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities" (OuterVolumeSpecName: "utilities") pod "b1992d44-6e31-4432-88f0-320408d9fa70" (UID: 
"b1992d44-6e31-4432-88f0-320408d9fa70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.669739 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4" (OuterVolumeSpecName: "kube-api-access-r9cl4") pod "b1992d44-6e31-4432-88f0-320408d9fa70" (UID: "b1992d44-6e31-4432-88f0-320408d9fa70"). InnerVolumeSpecName "kube-api-access-r9cl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.694477 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.739094 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1992d44-6e31-4432-88f0-320408d9fa70" (UID: "b1992d44-6e31-4432-88f0-320408d9fa70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.764015 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.764053 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9cl4\" (UniqueName: \"kubernetes.io/projected/b1992d44-6e31-4432-88f0-320408d9fa70-kube-api-access-r9cl4\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.764063 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1992d44-6e31-4432-88f0-320408d9fa70-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.784498 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:56 crc kubenswrapper[4971]: I0309 09:22:56.888645 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"] Mar 09 09:22:56 crc kubenswrapper[4971]: W0309 09:22:56.901439 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1cebb78_977e_41ad_907e_21c1c4597e28.slice/crio-b33fae5e0627831e4c21b3ad7e7a732e2ea28290024f73d2df8fa67ce368e82a WatchSource:0}: Error finding container b33fae5e0627831e4c21b3ad7e7a732e2ea28290024f73d2df8fa67ce368e82a: Status 404 returned error can't find the container with id b33fae5e0627831e4c21b3ad7e7a732e2ea28290024f73d2df8fa67ce368e82a Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.035637 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" 
event={"ID":"d1cebb78-977e-41ad-907e-21c1c4597e28","Type":"ContainerStarted","Data":"16d206af33d393f2c93446f9e96504e57633dd722cd6b35aa2d451b254f6ad64"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.035693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" event={"ID":"d1cebb78-977e-41ad-907e-21c1c4597e28","Type":"ContainerStarted","Data":"b33fae5e0627831e4c21b3ad7e7a732e2ea28290024f73d2df8fa67ce368e82a"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.037527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" event={"ID":"76acb4ff-8ac4-4ef3-846c-851420bb1c1b","Type":"ContainerStarted","Data":"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.037566 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" event={"ID":"76acb4ff-8ac4-4ef3-846c-851420bb1c1b","Type":"ContainerStarted","Data":"2eeae26b5cd2ecb93626f18622b4de2770dc96de677afd9ef675a8ce69c45596"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.037587 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerName="controller-manager" containerID="cri-o://4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070" gracePeriod=30 Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.037731 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.040579 4971 patch_prober.go:28] interesting pod/controller-manager-5d596d879-p8h8h container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.040631 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.053539 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xnbvj" event={"ID":"b1992d44-6e31-4432-88f0-320408d9fa70","Type":"ContainerDied","Data":"6e3e4d13648d14f7da9a96b1c15e0ea8e88af6582f1ec2d09c83c3ad86c96162"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.053591 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xnbvj" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.053610 4971 scope.go:117] "RemoveContainer" containerID="5690cec9250d82cb806a9759f34b8e0fcc6f90509029e2540c3624744749af74" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.057623 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerStarted","Data":"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.058372 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" podStartSLOduration=26.058360638 podStartE2EDuration="26.058360638s" podCreationTimestamp="2026-03-09 09:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 09:22:57.056026484 +0000 UTC m=+180.615954284" watchObservedRunningTime="2026-03-09 09:22:57.058360638 +0000 UTC m=+180.618288438" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.073209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" event={"ID":"85ec2882-f893-4f7f-a4c2-aa70aedcabf2","Type":"ContainerDied","Data":"3cb5b26f01fce584e56a8b8887ec536df131e3e3b492e10dfd8ad0f28a54cd6d"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.073560 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.077148 4971 generic.go:334] "Generic (PLEG): container finished" podID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerID="ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8" exitCode=0 Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.077211 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerDied","Data":"ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.084254 4971 generic.go:334] "Generic (PLEG): container finished" podID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerID="49d3140c9466631c561f4e312dcd601b58cad9ed43fad5689fef3f052d9bb46c" exitCode=0 Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.084387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerDied","Data":"49d3140c9466631c561f4e312dcd601b58cad9ed43fad5689fef3f052d9bb46c"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.094587 4971 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerStarted","Data":"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade"} Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.121621 4971 scope.go:117] "RemoveContainer" containerID="cfe701473a62562e73a494c9c76c3af4506f0da33202cb97a0e531d1d81f26ed" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.160143 4971 scope.go:117] "RemoveContainer" containerID="40ef5fadc5edc6653f20762069248da053830ef3814164e7a93c1c44551a7218" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.166500 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.176679 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xnbvj"] Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.180000 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"] Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.185878 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6bbcfdcb-mlcdr"] Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.193288 4971 scope.go:117] "RemoveContainer" containerID="ecc91af12b1d478ad620fd22b4df5c9297f085e244616960899de7669be6cd8f" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.466829 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-5d596d879-p8h8h_76acb4ff-8ac4-4ef3-846c-851420bb1c1b/controller-manager/0.log" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.466887 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.586178 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert\") pod \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.586287 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config\") pod \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.586432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hb9t\" (UniqueName: \"kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t\") pod \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.586467 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles\") pod \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.586533 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca\") pod \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\" (UID: \"76acb4ff-8ac4-4ef3-846c-851420bb1c1b\") " Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587140 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config" (OuterVolumeSpecName: "config") pod "76acb4ff-8ac4-4ef3-846c-851420bb1c1b" (UID: "76acb4ff-8ac4-4ef3-846c-851420bb1c1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587166 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "76acb4ff-8ac4-4ef3-846c-851420bb1c1b" (UID: "76acb4ff-8ac4-4ef3-846c-851420bb1c1b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587235 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca" (OuterVolumeSpecName: "client-ca") pod "76acb4ff-8ac4-4ef3-846c-851420bb1c1b" (UID: "76acb4ff-8ac4-4ef3-846c-851420bb1c1b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587586 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587613 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.587627 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.592525 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76acb4ff-8ac4-4ef3-846c-851420bb1c1b" (UID: "76acb4ff-8ac4-4ef3-846c-851420bb1c1b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.593800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t" (OuterVolumeSpecName: "kube-api-access-5hb9t") pod "76acb4ff-8ac4-4ef3-846c-851420bb1c1b" (UID: "76acb4ff-8ac4-4ef3-846c-851420bb1c1b"). InnerVolumeSpecName "kube-api-access-5hb9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.688507 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hb9t\" (UniqueName: \"kubernetes.io/projected/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-kube-api-access-5hb9t\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:57 crc kubenswrapper[4971]: I0309 09:22:57.688873 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76acb4ff-8ac4-4ef3-846c-851420bb1c1b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.104499 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerStarted","Data":"afc9d210cc1019961801a8c7cc89b21aed322233e3fce1f28b394c9b20e14baf"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.111030 4971 generic.go:334] "Generic (PLEG): container finished" podID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerID="8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d" exitCode=0 Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.111095 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerDied","Data":"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.113696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerStarted","Data":"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.117178 4971 generic.go:334] "Generic (PLEG): container finished" podID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" 
containerID="063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade" exitCode=0 Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.117240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerDied","Data":"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.119439 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerStarted","Data":"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.122553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerStarted","Data":"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.125276 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerStarted","Data":"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.128276 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-5d596d879-p8h8h_76acb4ff-8ac4-4ef3-846c-851420bb1c1b/controller-manager/0.log" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129356 4971 generic.go:334] "Generic (PLEG): container finished" podID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerID="4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070" exitCode=2 Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129622 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129624 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" event={"ID":"76acb4ff-8ac4-4ef3-846c-851420bb1c1b","Type":"ContainerDied","Data":"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129785 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129807 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d596d879-p8h8h" event={"ID":"76acb4ff-8ac4-4ef3-846c-851420bb1c1b","Type":"ContainerDied","Data":"2eeae26b5cd2ecb93626f18622b4de2770dc96de677afd9ef675a8ce69c45596"} Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.129834 4971 scope.go:117] "RemoveContainer" containerID="4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.130879 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s9xnf" podStartSLOduration=34.011799844 podStartE2EDuration="45.130863778s" podCreationTimestamp="2026-03-09 09:22:13 +0000 UTC" firstStartedPulling="2026-03-09 09:22:46.557078972 +0000 UTC m=+170.117006792" lastFinishedPulling="2026-03-09 09:22:57.676142916 +0000 UTC m=+181.236070726" observedRunningTime="2026-03-09 09:22:58.12980907 +0000 UTC m=+181.689736880" watchObservedRunningTime="2026-03-09 09:22:58.130863778 +0000 UTC m=+181.690791588" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.140944 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.148121 4971 scope.go:117] "RemoveContainer" containerID="4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070" Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 09:22:58.149460 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070\": container with ID starting with 4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070 not found: ID does not exist" containerID="4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.149528 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070"} err="failed to get container status \"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070\": rpc error: code = NotFound desc = could not find container \"4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070\": container with ID starting with 4cf95a0873484153ee167ed28e2bd26355176f42936aa4c23ef07b24f83d0070 not found: ID does not exist" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.314762 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4dfvc" podStartSLOduration=34.144938527 podStartE2EDuration="45.31473711s" podCreationTimestamp="2026-03-09 09:22:13 +0000 UTC" firstStartedPulling="2026-03-09 09:22:46.5570259 +0000 UTC m=+170.116953710" lastFinishedPulling="2026-03-09 09:22:57.726824483 +0000 UTC m=+181.286752293" observedRunningTime="2026-03-09 09:22:58.303891451 +0000 UTC m=+181.863819261" watchObservedRunningTime="2026-03-09 09:22:58.31473711 +0000 UTC m=+181.874664920" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 
09:22:58.325402 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" podStartSLOduration=7.325377972 podStartE2EDuration="7.325377972s" podCreationTimestamp="2026-03-09 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:58.318767245 +0000 UTC m=+181.878695055" watchObservedRunningTime="2026-03-09 09:22:58.325377972 +0000 UTC m=+181.885305802" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.341600 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.346971 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d596d879-p8h8h"] Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 09:22:58.351173 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1054c243_8a85_4262_ba12_2ee5643d0255.slice/crio-conmon-e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.447795 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"] Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 09:22:58.448099 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="extract-content" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448117 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="extract-content" Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 
09:22:58.448132 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerName="controller-manager" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448140 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerName="controller-manager" Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 09:22:58.448158 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="extract-utilities" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448166 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="extract-utilities" Mar 09 09:22:58 crc kubenswrapper[4971]: E0309 09:22:58.448173 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="registry-server" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448181 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="registry-server" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448289 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" containerName="controller-manager" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448308 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" containerName="registry-server" Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.448756 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.450960 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.451680 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.452839 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.453195 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.453251 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.454764 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.472019 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.485479 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"]
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.499153 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg7zb\" (UniqueName: \"kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.499214 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.499264 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.499293 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.499338 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.600596 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.600662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg7zb\" (UniqueName: \"kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.600712 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.600730 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.600762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.602306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.603165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.603529 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.606203 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.626635 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg7zb\" (UniqueName: \"kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb\") pod \"controller-manager-6ff784b8b5-kgfgb\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") " pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:58 crc kubenswrapper[4971]: I0309 09:22:58.768879 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.024838 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"]
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.146261 4971 generic.go:334] "Generic (PLEG): container finished" podID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerID="43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135" exitCode=0
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.146400 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerDied","Data":"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135"}
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.151909 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.155684 4971 generic.go:334] "Generic (PLEG): container finished" podID="1054c243-8a85-4262-ba12-2ee5643d0255" containerID="e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83" exitCode=0
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.164281 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76acb4ff-8ac4-4ef3-846c-851420bb1c1b" path="/var/lib/kubelet/pods/76acb4ff-8ac4-4ef3-846c-851420bb1c1b/volumes"
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.164678 4971 generic.go:334] "Generic (PLEG): container finished" podID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerID="b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17" exitCode=0
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.170243 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ec2882-f893-4f7f-a4c2-aa70aedcabf2" path="/var/lib/kubelet/pods/85ec2882-f893-4f7f-a4c2-aa70aedcabf2/volumes"
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.170980 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1992d44-6e31-4432-88f0-320408d9fa70" path="/var/lib/kubelet/pods/b1992d44-6e31-4432-88f0-320408d9fa70/volumes"
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.171726 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerDied","Data":"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83"}
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.171769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" event={"ID":"3f4aa567-6a43-43fb-860b-e012a2eb2878","Type":"ContainerStarted","Data":"0dcb7e2a1bf67b029b73157407b99f1beeb42683f3bf11888e8590386114183b"}
Mar 09 09:22:59 crc kubenswrapper[4971]: I0309 09:22:59.171784 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerDied","Data":"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17"}
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.172967 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.175238 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2"}
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.176303 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.179185 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" event={"ID":"3f4aa567-6a43-43fb-860b-e012a2eb2878","Type":"ContainerStarted","Data":"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"}
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.179226 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.184848 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:23:00 crc kubenswrapper[4971]: I0309 09:23:00.196996 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.19697975 podStartE2EDuration="1m15.19697975s" podCreationTimestamp="2026-03-09 09:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:00.194517552 +0000 UTC m=+183.754445362" watchObservedRunningTime="2026-03-09 09:23:00.19697975 +0000 UTC m=+183.756907560"
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.195782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerStarted","Data":"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd"}
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.219847 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6x5k" podStartSLOduration=34.095621569 podStartE2EDuration="49.21983041s" podCreationTimestamp="2026-03-09 09:22:14 +0000 UTC" firstStartedPulling="2026-03-09 09:22:46.556977208 +0000 UTC m=+170.116905028" lastFinishedPulling="2026-03-09 09:23:01.681186049 +0000 UTC m=+185.241113869" observedRunningTime="2026-03-09 09:23:03.213583937 +0000 UTC m=+186.773511747" watchObservedRunningTime="2026-03-09 09:23:03.21983041 +0000 UTC m=+186.779758220"
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.220856 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" podStartSLOduration=12.220847757 podStartE2EDuration="12.220847757s" podCreationTimestamp="2026-03-09 09:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:00.214179486 +0000 UTC m=+183.774107306" watchObservedRunningTime="2026-03-09 09:23:03.220847757 +0000 UTC m=+186.780775567"
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.909184 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4dfvc"
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.909626 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4dfvc"
Mar 09 09:23:03 crc kubenswrapper[4971]: I0309 09:23:03.949732 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4dfvc"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.235835 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.235875 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.246012 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4dfvc"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.294227 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.836132 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6x5k"
Mar 09 09:23:04 crc kubenswrapper[4971]: I0309 09:23:04.836202 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k6x5k"
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.216673 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerStarted","Data":"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910"}
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.236763 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7sctl" podStartSLOduration=34.234839229 podStartE2EDuration="51.236747097s" podCreationTimestamp="2026-03-09 09:22:14 +0000 UTC" firstStartedPulling="2026-03-09 09:22:46.556960557 +0000 UTC m=+170.116888377" lastFinishedPulling="2026-03-09 09:23:03.558868435 +0000 UTC m=+187.118796245" observedRunningTime="2026-03-09 09:23:05.234389273 +0000 UTC m=+188.794317083" watchObservedRunningTime="2026-03-09 09:23:05.236747097 +0000 UTC m=+188.796674907"
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.241998 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7sctl"
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.242121 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7sctl"
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.258955 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:05 crc kubenswrapper[4971]: I0309 09:23:05.879618 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6x5k" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:23:05 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:23:05 crc kubenswrapper[4971]: >
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.229416 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerStarted","Data":"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5"}
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.232035 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerStarted","Data":"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0"}
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.234221 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerStarted","Data":"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa"}
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.256447 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5cqt9" podStartSLOduration=8.868018338 podStartE2EDuration="55.256428044s" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="2026-03-09 09:22:19.216518981 +0000 UTC m=+142.776446791" lastFinishedPulling="2026-03-09 09:23:05.604928687 +0000 UTC m=+189.164856497" observedRunningTime="2026-03-09 09:23:06.255689097 +0000 UTC m=+189.815616907" watchObservedRunningTime="2026-03-09 09:23:06.256428044 +0000 UTC m=+189.816355854"
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.276890 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ft9v2" podStartSLOduration=2.120711955 podStartE2EDuration="55.276871757s" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="2026-03-09 09:22:12.596114397 +0000 UTC m=+136.156042207" lastFinishedPulling="2026-03-09 09:23:05.752274199 +0000 UTC m=+189.312202009" observedRunningTime="2026-03-09 09:23:06.274193301 +0000 UTC m=+189.834121121" watchObservedRunningTime="2026-03-09 09:23:06.276871757 +0000 UTC m=+189.836799567"
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.292913 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sctl" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:23:06 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:23:06 crc kubenswrapper[4971]: >
Mar 09 09:23:06 crc kubenswrapper[4971]: I0309 09:23:06.303011 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4dhm7" podStartSLOduration=9.011659827 podStartE2EDuration="55.302995573s" podCreationTimestamp="2026-03-09 09:22:11 +0000 UTC" firstStartedPulling="2026-03-09 09:22:19.216611514 +0000 UTC m=+142.776539324" lastFinishedPulling="2026-03-09 09:23:05.50794726 +0000 UTC m=+189.067875070" observedRunningTime="2026-03-09 09:23:06.296129127 +0000 UTC m=+189.856056937" watchObservedRunningTime="2026-03-09 09:23:06.302995573 +0000 UTC m=+189.862923393"
Mar 09 09:23:07 crc kubenswrapper[4971]: I0309 09:23:07.321626 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"]
Mar 09 09:23:07 crc kubenswrapper[4971]: I0309 09:23:07.321850 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s9xnf" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="registry-server" containerID="cri-o://afc9d210cc1019961801a8c7cc89b21aed322233e3fce1f28b394c9b20e14baf" gracePeriod=2
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.066841 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.250473 4971 generic.go:334] "Generic (PLEG): container finished" podID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerID="afc9d210cc1019961801a8c7cc89b21aed322233e3fce1f28b394c9b20e14baf" exitCode=0
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.250507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerDied","Data":"afc9d210cc1019961801a8c7cc89b21aed322233e3fce1f28b394c9b20e14baf"}
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.534112 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.619064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lslm9\" (UniqueName: \"kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9\") pod \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") "
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.619131 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content\") pod \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") "
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.619189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities\") pod \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\" (UID: \"4b5aceb0-6798-4435-9d7f-2548d1a42d11\") "
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.620653 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities" (OuterVolumeSpecName: "utilities") pod "4b5aceb0-6798-4435-9d7f-2548d1a42d11" (UID: "4b5aceb0-6798-4435-9d7f-2548d1a42d11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.627492 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9" (OuterVolumeSpecName: "kube-api-access-lslm9") pod "4b5aceb0-6798-4435-9d7f-2548d1a42d11" (UID: "4b5aceb0-6798-4435-9d7f-2548d1a42d11"). InnerVolumeSpecName "kube-api-access-lslm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.646247 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b5aceb0-6798-4435-9d7f-2548d1a42d11" (UID: "4b5aceb0-6798-4435-9d7f-2548d1a42d11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.720559 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lslm9\" (UniqueName: \"kubernetes.io/projected/4b5aceb0-6798-4435-9d7f-2548d1a42d11-kube-api-access-lslm9\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.720638 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:09 crc kubenswrapper[4971]: I0309 09:23:09.720649 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b5aceb0-6798-4435-9d7f-2548d1a42d11-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.256894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9xnf" event={"ID":"4b5aceb0-6798-4435-9d7f-2548d1a42d11","Type":"ContainerDied","Data":"5c6b28d8e3045025f13a68b6eeca8267de1e2bd0735b5f11870b01ed8c16c64e"}
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.256952 4971 scope.go:117] "RemoveContainer" containerID="afc9d210cc1019961801a8c7cc89b21aed322233e3fce1f28b394c9b20e14baf"
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.257084 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9xnf"
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.276108 4971 scope.go:117] "RemoveContainer" containerID="49d3140c9466631c561f4e312dcd601b58cad9ed43fad5689fef3f052d9bb46c"
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.296030 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"]
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.305608 4971 scope.go:117] "RemoveContainer" containerID="1773a68a876713072ab28e2de5f9975c302bece56a93e2c6be2d9fb0e5ae13a0"
Mar 09 09:23:10 crc kubenswrapper[4971]: I0309 09:23:10.313638 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9xnf"]
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.158822 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" path="/var/lib/kubelet/pods/4b5aceb0-6798-4435-9d7f-2548d1a42d11/volumes"
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.560863 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.560914 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.604575 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.892595 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"]
Mar 09 09:23:11 crc kubenswrapper[4971]: I0309 09:23:11.892848 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" podUID="d1cebb78-977e-41ad-907e-21c1c4597e28" containerName="route-controller-manager" containerID="cri-o://16d206af33d393f2c93446f9e96504e57633dd722cd6b35aa2d451b254f6ad64" gracePeriod=30
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.069468 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5cqt9"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.070365 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5cqt9"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.111062 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5cqt9"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.245669 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4dhm7"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.245725 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4dhm7"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.285751 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4dhm7"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.311305 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ft9v2"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.311994 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5cqt9"
Mar 09 09:23:12 crc kubenswrapper[4971]: I0309 09:23:12.331818 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4dhm7"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.276812 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1cebb78-977e-41ad-907e-21c1c4597e28" containerID="16d206af33d393f2c93446f9e96504e57633dd722cd6b35aa2d451b254f6ad64" exitCode=0
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.276861 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" event={"ID":"d1cebb78-977e-41ad-907e-21c1c4597e28","Type":"ContainerDied","Data":"16d206af33d393f2c93446f9e96504e57633dd722cd6b35aa2d451b254f6ad64"}
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.638816 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672292 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"]
Mar 09 09:23:13 crc kubenswrapper[4971]: E0309 09:23:13.672692 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="extract-content"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672705 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="extract-content"
Mar 09 09:23:13 crc kubenswrapper[4971]: E0309 09:23:13.672723 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="extract-utilities"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672730 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="extract-utilities"
Mar 09 09:23:13 crc kubenswrapper[4971]: E0309 09:23:13.672738 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="registry-server"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672745 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="registry-server"
Mar 09 09:23:13 crc kubenswrapper[4971]: E0309 09:23:13.672760 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cebb78-977e-41ad-907e-21c1c4597e28" containerName="route-controller-manager"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672767 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cebb78-977e-41ad-907e-21c1c4597e28" containerName="route-controller-manager"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.672984 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5aceb0-6798-4435-9d7f-2548d1a42d11" containerName="registry-server"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.673000 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cebb78-977e-41ad-907e-21c1c4597e28" containerName="route-controller-manager"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.674036 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.677497 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.677568 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.678179 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.678372 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvjfw\" (UniqueName: \"kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.685362 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"]
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.779813 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config\") pod \"d1cebb78-977e-41ad-907e-21c1c4597e28\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") "
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.779960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert\") pod \"d1cebb78-977e-41ad-907e-21c1c4597e28\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") "
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780047 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca\") pod \"d1cebb78-977e-41ad-907e-21c1c4597e28\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") "
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780114 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2w47\" (UniqueName: \"kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47\") pod \"d1cebb78-977e-41ad-907e-21c1c4597e28\" (UID: \"d1cebb78-977e-41ad-907e-21c1c4597e28\") "
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780361 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780397 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780443 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.780505 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjfw\" (UniqueName: \"kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.781880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config" (OuterVolumeSpecName: "config") pod "d1cebb78-977e-41ad-907e-21c1c4597e28" (UID: "d1cebb78-977e-41ad-907e-21c1c4597e28"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.782049 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1cebb78-977e-41ad-907e-21c1c4597e28" (UID: "d1cebb78-977e-41ad-907e-21c1c4597e28"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.783334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.784716 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.787979 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.790383 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert" (OuterVolumeSpecName: "serving-cert") 
pod "d1cebb78-977e-41ad-907e-21c1c4597e28" (UID: "d1cebb78-977e-41ad-907e-21c1c4597e28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.797964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvjfw\" (UniqueName: \"kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw\") pod \"route-controller-manager-7449c948f6-fvmkg\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") " pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.800604 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47" (OuterVolumeSpecName: "kube-api-access-b2w47") pod "d1cebb78-977e-41ad-907e-21c1c4597e28" (UID: "d1cebb78-977e-41ad-907e-21c1c4597e28"). InnerVolumeSpecName "kube-api-access-b2w47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.881088 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.881122 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2w47\" (UniqueName: \"kubernetes.io/projected/d1cebb78-977e-41ad-907e-21c1c4597e28-kube-api-access-b2w47\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.881135 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cebb78-977e-41ad-907e-21c1c4597e28-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.881143 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cebb78-977e-41ad-907e-21c1c4597e28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:13 crc kubenswrapper[4971]: I0309 09:23:13.996101 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.283095 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" event={"ID":"d1cebb78-977e-41ad-907e-21c1c4597e28","Type":"ContainerDied","Data":"b33fae5e0627831e4c21b3ad7e7a732e2ea28290024f73d2df8fa67ce368e82a"} Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.283160 4971 scope.go:117] "RemoveContainer" containerID="16d206af33d393f2c93446f9e96504e57633dd722cd6b35aa2d451b254f6ad64" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.283335 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.320475 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"] Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.327315 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-678c6bcfb7-wtdzw"] Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.332721 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.333000 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4dhm7" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="registry-server" containerID="cri-o://bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0" gracePeriod=2 Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.367405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"] Mar 09 09:23:14 crc kubenswrapper[4971]: W0309 09:23:14.372708 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733bd451_d1e8_48db_a9fb_87a12b247c39.slice/crio-61d9edc1658341fe2cb9fe37f92690ea33d5d0e2670b094f01a048a233d87413 WatchSource:0}: Error finding container 61d9edc1658341fe2cb9fe37f92690ea33d5d0e2670b094f01a048a233d87413: Status 404 returned error can't find the container with id 61d9edc1658341fe2cb9fe37f92690ea33d5d0e2670b094f01a048a233d87413 Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.789755 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.884087 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.897378 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content\") pod \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.897444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities\") pod \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.897746 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fwdd\" (UniqueName: \"kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd\") pod \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\" (UID: \"41ca417f-9f99-44da-b444-4ecf1b9b5d04\") " Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.898380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities" (OuterVolumeSpecName: "utilities") pod "41ca417f-9f99-44da-b444-4ecf1b9b5d04" (UID: "41ca417f-9f99-44da-b444-4ecf1b9b5d04"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.905534 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd" (OuterVolumeSpecName: "kube-api-access-2fwdd") pod "41ca417f-9f99-44da-b444-4ecf1b9b5d04" (UID: "41ca417f-9f99-44da-b444-4ecf1b9b5d04"). InnerVolumeSpecName "kube-api-access-2fwdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.932100 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.964389 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41ca417f-9f99-44da-b444-4ecf1b9b5d04" (UID: "41ca417f-9f99-44da-b444-4ecf1b9b5d04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.998723 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fwdd\" (UniqueName: \"kubernetes.io/projected/41ca417f-9f99-44da-b444-4ecf1b9b5d04-kube-api-access-2fwdd\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.998758 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:14 crc kubenswrapper[4971]: I0309 09:23:14.998768 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ca417f-9f99-44da-b444-4ecf1b9b5d04-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.166837 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1cebb78-977e-41ad-907e-21c1c4597e28" path="/var/lib/kubelet/pods/d1cebb78-977e-41ad-907e-21c1c4597e28/volumes" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.286792 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.293702 4971 generic.go:334] "Generic (PLEG): container finished" podID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerID="bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0" exitCode=0 Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.293769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerDied","Data":"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0"} Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.293790 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-4dhm7" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.293805 4971 scope.go:117] "RemoveContainer" containerID="bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.293795 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dhm7" event={"ID":"41ca417f-9f99-44da-b444-4ecf1b9b5d04","Type":"ContainerDied","Data":"2af0350a4b43f0cb40c6ebba6bf403deb23ccbfab656b35eeff9d9db5d4fb8ab"} Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.296676 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" event={"ID":"733bd451-d1e8-48db-a9fb-87a12b247c39","Type":"ContainerStarted","Data":"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"} Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.296725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" event={"ID":"733bd451-d1e8-48db-a9fb-87a12b247c39","Type":"ContainerStarted","Data":"61d9edc1658341fe2cb9fe37f92690ea33d5d0e2670b094f01a048a233d87413"} Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.296893 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.305320 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.330471 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.334047 4971 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.341786 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" podStartSLOduration=4.341768729 podStartE2EDuration="4.341768729s" podCreationTimestamp="2026-03-09 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:15.331774395 +0000 UTC m=+198.891702215" watchObservedRunningTime="2026-03-09 09:23:15.341768729 +0000 UTC m=+198.901696529" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.343309 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4dhm7"] Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.343421 4971 scope.go:117] "RemoveContainer" containerID="43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.372896 4971 scope.go:117] "RemoveContainer" containerID="db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.406640 4971 scope.go:117] "RemoveContainer" containerID="bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0" Mar 09 09:23:15 crc kubenswrapper[4971]: E0309 09:23:15.407600 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0\": container with ID starting with bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0 not found: ID does not exist" containerID="bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.407644 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0"} err="failed to get container status \"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0\": rpc error: code = NotFound desc = could not find container \"bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0\": container with ID starting with bd2a09ab3c73b8deb1c444ebcfe0d2774422fa15e2a2b6e929781630045384b0 not found: ID does not exist" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.407668 4971 scope.go:117] "RemoveContainer" containerID="43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135" Mar 09 09:23:15 crc kubenswrapper[4971]: E0309 09:23:15.409757 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135\": container with ID starting with 43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135 not found: ID does not exist" containerID="43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.409812 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135"} err="failed to get container status \"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135\": rpc error: code = NotFound desc = could not find container \"43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135\": container with ID starting with 43be5c38038dfe8257f9ad026b51337e974c887c65f216f5868aed47a67a6135 not found: ID does not exist" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.409849 4971 scope.go:117] "RemoveContainer" containerID="db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846" Mar 09 09:23:15 crc kubenswrapper[4971]: E0309 09:23:15.410160 4971 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846\": container with ID starting with db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846 not found: ID does not exist" containerID="db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846" Mar 09 09:23:15 crc kubenswrapper[4971]: I0309 09:23:15.410184 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846"} err="failed to get container status \"db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846\": rpc error: code = NotFound desc = could not find container \"db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846\": container with ID starting with db4015f80efc6015c1d6ab615103144cd9f0008f76b77dc5b13f667608a42846 not found: ID does not exist" Mar 09 09:23:16 crc kubenswrapper[4971]: I0309 09:23:16.814423 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nvzgg"] Mar 09 09:23:17 crc kubenswrapper[4971]: I0309 09:23:17.115490 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"] Mar 09 09:23:17 crc kubenswrapper[4971]: I0309 09:23:17.159186 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" path="/var/lib/kubelet/pods/41ca417f-9f99-44da-b444-4ecf1b9b5d04/volumes" Mar 09 09:23:17 crc kubenswrapper[4971]: I0309 09:23:17.311643 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7sctl" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="registry-server" containerID="cri-o://0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910" gracePeriod=2 Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.221090 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.318190 4971 generic.go:334] "Generic (PLEG): container finished" podID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerID="0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910" exitCode=0 Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.318230 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerDied","Data":"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910"} Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.318261 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sctl" event={"ID":"f03c17cc-a83d-4187-99d2-2c91b6edb26c","Type":"ContainerDied","Data":"9849c714ed13c7cf7d1113e1721f50bfaf2d114e7f2b4f84c3682c9dff85dce4"} Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.318268 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sctl" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.318278 4971 scope.go:117] "RemoveContainer" containerID="0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.333332 4971 scope.go:117] "RemoveContainer" containerID="063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.348781 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gtx\" (UniqueName: \"kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx\") pod \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.348881 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities\") pod \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.348931 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content\") pod \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\" (UID: \"f03c17cc-a83d-4187-99d2-2c91b6edb26c\") " Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.349814 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities" (OuterVolumeSpecName: "utilities") pod "f03c17cc-a83d-4187-99d2-2c91b6edb26c" (UID: "f03c17cc-a83d-4187-99d2-2c91b6edb26c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.355309 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx" (OuterVolumeSpecName: "kube-api-access-g7gtx") pod "f03c17cc-a83d-4187-99d2-2c91b6edb26c" (UID: "f03c17cc-a83d-4187-99d2-2c91b6edb26c"). InnerVolumeSpecName "kube-api-access-g7gtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.364620 4971 scope.go:117] "RemoveContainer" containerID="f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.387493 4971 scope.go:117] "RemoveContainer" containerID="0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910" Mar 09 09:23:18 crc kubenswrapper[4971]: E0309 09:23:18.387884 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910\": container with ID starting with 0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910 not found: ID does not exist" containerID="0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.387925 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910"} err="failed to get container status \"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910\": rpc error: code = NotFound desc = could not find container \"0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910\": container with ID starting with 0e842645ec6fd8f30f8c2f9f304defe3c8a7cf878f3a74bea50a0947928d9910 not found: ID does not exist" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.387980 
4971 scope.go:117] "RemoveContainer" containerID="063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade" Mar 09 09:23:18 crc kubenswrapper[4971]: E0309 09:23:18.388428 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade\": container with ID starting with 063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade not found: ID does not exist" containerID="063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.388471 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade"} err="failed to get container status \"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade\": rpc error: code = NotFound desc = could not find container \"063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade\": container with ID starting with 063f02fe68a3975788773fe2a3f20dcb2bd59eeb0119ca1c64b204cbc9776ade not found: ID does not exist" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.388499 4971 scope.go:117] "RemoveContainer" containerID="f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476" Mar 09 09:23:18 crc kubenswrapper[4971]: E0309 09:23:18.388855 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476\": container with ID starting with f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476 not found: ID does not exist" containerID="f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.388918 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476"} err="failed to get container status \"f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476\": rpc error: code = NotFound desc = could not find container \"f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476\": container with ID starting with f1730aa4237056d4df2e63efeaaec3cd995d946541bd4b39698fecf657608476 not found: ID does not exist" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.450171 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7gtx\" (UniqueName: \"kubernetes.io/projected/f03c17cc-a83d-4187-99d2-2c91b6edb26c-kube-api-access-g7gtx\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.450215 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.493103 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f03c17cc-a83d-4187-99d2-2c91b6edb26c" (UID: "f03c17cc-a83d-4187-99d2-2c91b6edb26c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.552011 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f03c17cc-a83d-4187-99d2-2c91b6edb26c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.662316 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"]
Mar 09 09:23:18 crc kubenswrapper[4971]: I0309 09:23:18.668936 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7sctl"]
Mar 09 09:23:19 crc kubenswrapper[4971]: I0309 09:23:19.158157 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" path="/var/lib/kubelet/pods/f03c17cc-a83d-4187-99d2-2c91b6edb26c/volumes"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.461767 4971 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462531 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="extract-content"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462546 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="extract-content"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462558 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462564 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462576 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="extract-content"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462581 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="extract-content"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462603 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="extract-utilities"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462610 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="extract-utilities"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462625 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462631 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.462639 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="extract-utilities"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462645 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="extract-utilities"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462765 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03c17cc-a83d-4187-99d2-2c91b6edb26c" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.462780 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ca417f-9f99-44da-b444-4ecf1b9b5d04" containerName="registry-server"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463247 4971 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463404 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463762 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c" gracePeriod=15
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463799 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845" gracePeriod=15
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463878 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202" gracePeriod=15
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463977 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2" gracePeriod=15
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.463878 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6" gracePeriod=15
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464516 4971 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464793 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464807 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464817 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464823 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464839 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464846 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464856 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464864 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464876 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464883 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464896 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464903 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464912 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464918 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464929 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464935 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.464942 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.464949 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465054 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465065 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465073 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465081 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465090 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465099 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465109 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465117 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465127 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.465245 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465253 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.465263 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465269 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.465379 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.498812 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540667 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540714 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540743 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540767 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540783 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540796 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540812 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.540853 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642016 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642417 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642449 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642178 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642505 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642529 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642527 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642632 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642652 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642667 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642702 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642804 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642857 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.642824 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: I0309 09:23:25.805227 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:23:25 crc kubenswrapper[4971]: E0309 09:23:25.828563 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b21f2fdf94252 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:23:25.827195474 +0000 UTC m=+209.387123284,LastTimestamp:2026-03-09 09:23:25.827195474 +0000 UTC m=+209.387123284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.364769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815"}
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.364842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e92a963218d54bc6c4e319258784fa500509d77ba6458712d6bcb22655ba1f9f"}
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.365534 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.365929 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.366746 4971 generic.go:334] "Generic (PLEG): container finished" podID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" containerID="732cb3cce9822a22f43f2856399c978d0c7464fbbfff61e340a55dd7c8effa19" exitCode=0
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.366825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ac5d26f5-5e17-4dd7-a334-5060a68b2d08","Type":"ContainerDied","Data":"732cb3cce9822a22f43f2856399c978d0c7464fbbfff61e340a55dd7c8effa19"}
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.367503 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.367719 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.368011 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.369028 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.370439 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.371374 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2" exitCode=0
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.371395 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6" exitCode=0
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.371403 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845" exitCode=0
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.371413 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202" exitCode=2
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.371448 4971 scope.go:117] "RemoveContainer" containerID="0498fa34e162baaf3d51e00c839035dfb5a043d12e709f17f37859b8d3fbe083"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.595611 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.596313 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.596863 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.597226 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.597495 4971 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:26 crc kubenswrapper[4971]: I0309 09:23:26.597521 4971 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.597737 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="200ms"
Mar 09 09:23:26 crc kubenswrapper[4971]: E0309 09:23:26.799574 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="400ms"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.163377 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.163956 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.164434 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: E0309 09:23:27.201061 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="800ms"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.382204 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.857480 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.859054 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.859328 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.859611 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.859689 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.860273 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.860617 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.861076 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused"
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983502 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983665 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access\") pod \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") "
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983607 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983708 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983737 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock\") pod \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") "
Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983776 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983825 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir\") pod \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\" (UID: \"ac5d26f5-5e17-4dd7-a334-5060a68b2d08\") " Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock" (OuterVolumeSpecName: "var-lock") pod "ac5d26f5-5e17-4dd7-a334-5060a68b2d08" (UID: "ac5d26f5-5e17-4dd7-a334-5060a68b2d08"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983922 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.983964 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ac5d26f5-5e17-4dd7-a334-5060a68b2d08" (UID: "ac5d26f5-5e17-4dd7-a334-5060a68b2d08"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984045 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984177 4971 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984211 4971 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984222 4971 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984232 4971 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.984242 4971 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:27 crc kubenswrapper[4971]: I0309 09:23:27.989413 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ac5d26f5-5e17-4dd7-a334-5060a68b2d08" (UID: "ac5d26f5-5e17-4dd7-a334-5060a68b2d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.002537 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="1.6s" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.086035 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac5d26f5-5e17-4dd7-a334-5060a68b2d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.392261 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ac5d26f5-5e17-4dd7-a334-5060a68b2d08","Type":"ContainerDied","Data":"7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc"} Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.392318 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7296a2a500abcb07929bae78e97ee730e2936703b117333121cca4c532b42cbc" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.392275 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.395425 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.396036 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c" exitCode=0 Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.396088 4971 scope.go:117] "RemoveContainer" containerID="0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.396107 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.409537 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.410018 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.410248 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.410819 4971 scope.go:117] "RemoveContainer" containerID="18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.415193 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.415642 4971 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.415898 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.424568 4971 scope.go:117] "RemoveContainer" containerID="100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.436922 4971 scope.go:117] "RemoveContainer" containerID="c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.449634 4971 scope.go:117] "RemoveContainer" 
containerID="c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.463032 4971 scope.go:117] "RemoveContainer" containerID="b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.477994 4971 scope.go:117] "RemoveContainer" containerID="0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.479055 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2\": container with ID starting with 0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2 not found: ID does not exist" containerID="0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.479095 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2"} err="failed to get container status \"0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2\": rpc error: code = NotFound desc = could not find container \"0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2\": container with ID starting with 0f2bee379f42ccccd8273d95dd2a0a851847888acd6e7718da0e1dcb6b23d8d2 not found: ID does not exist" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.479121 4971 scope.go:117] "RemoveContainer" containerID="18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.479519 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6\": container with ID starting with 
18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6 not found: ID does not exist" containerID="18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.479656 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6"} err="failed to get container status \"18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6\": rpc error: code = NotFound desc = could not find container \"18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6\": container with ID starting with 18bcdbacb9dee52d1b773dbd316af70261b6dbf5f177e646d4b4d17814bcf3d6 not found: ID does not exist" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.479841 4971 scope.go:117] "RemoveContainer" containerID="100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.480223 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845\": container with ID starting with 100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845 not found: ID does not exist" containerID="100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.480304 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845"} err="failed to get container status \"100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845\": rpc error: code = NotFound desc = could not find container \"100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845\": container with ID starting with 100768721be25fba901aba996d0fadab51f1a4ad651d107c35aa0a6f8a8a0845 not found: ID does not 
exist" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.480425 4971 scope.go:117] "RemoveContainer" containerID="c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.480919 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202\": container with ID starting with c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202 not found: ID does not exist" containerID="c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.480957 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202"} err="failed to get container status \"c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202\": rpc error: code = NotFound desc = could not find container \"c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202\": container with ID starting with c668a23ef472b7729b682d1415709cf4bbfcfffeb017fa593e76df60aca5f202 not found: ID does not exist" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.481017 4971 scope.go:117] "RemoveContainer" containerID="c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.481577 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c\": container with ID starting with c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c not found: ID does not exist" containerID="c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.481714 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c"} err="failed to get container status \"c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c\": rpc error: code = NotFound desc = could not find container \"c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c\": container with ID starting with c494d070274a070206d91bffce12aeabf707c05a65cdd78caff26fb6ebbf4e6c not found: ID does not exist" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.481851 4971 scope.go:117] "RemoveContainer" containerID="b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce" Mar 09 09:23:28 crc kubenswrapper[4971]: E0309 09:23:28.482284 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce\": container with ID starting with b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce not found: ID does not exist" containerID="b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce" Mar 09 09:23:28 crc kubenswrapper[4971]: I0309 09:23:28.482333 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce"} err="failed to get container status \"b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce\": rpc error: code = NotFound desc = could not find container \"b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce\": container with ID starting with b762123102ea5f47e3dda61d57f51f6e18cfb4749be3b6fb11239448485e44ce not found: ID does not exist" Mar 09 09:23:29 crc kubenswrapper[4971]: I0309 09:23:29.161874 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 09:23:29 crc 
kubenswrapper[4971]: E0309 09:23:29.603915 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="3.2s" Mar 09 09:23:32 crc kubenswrapper[4971]: E0309 09:23:32.127452 4971 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b21f2fdf94252 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:23:25.827195474 +0000 UTC m=+209.387123284,LastTimestamp:2026-03-09 09:23:25.827195474 +0000 UTC m=+209.387123284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:23:32 crc kubenswrapper[4971]: E0309 09:23:32.804596 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="6.4s" Mar 09 09:23:35 crc kubenswrapper[4971]: E0309 09:23:35.179422 4971 desired_state_of_world_populator.go:312] "Error processing volume" err="error 
processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" volumeName="registry-storage" Mar 09 09:23:37 crc kubenswrapper[4971]: I0309 09:23:37.163579 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: I0309 09:23:37.164288 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.184011 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:23:37Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:23:37Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:23:37Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:23:37Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.184557 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.184917 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.185157 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 
09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.185382 4971 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:37 crc kubenswrapper[4971]: E0309 09:23:37.185405 4971 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.151698 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.153554 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.154412 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.171385 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.171427 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:38 crc kubenswrapper[4971]: E0309 09:23:38.171914 4971 mirror_client.go:138] "Failed deleting a mirror pod" 
err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.172562 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:38 crc kubenswrapper[4971]: I0309 09:23:38.449571 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd6414887e9f0692e28d3467ee0a8ef8f1a9ddc30e7fba500c5f7581de199207"} Mar 09 09:23:39 crc kubenswrapper[4971]: E0309 09:23:39.206133 4971 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="7s" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.459210 4971 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="50b0d7f7cbcec7dc05b3fcf3b8799985b32316225f5f5f123a3a9dfd3a1cf9c8" exitCode=0 Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.459293 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"50b0d7f7cbcec7dc05b3fcf3b8799985b32316225f5f5f123a3a9dfd3a1cf9c8"} Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.459764 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.459817 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.460297 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:39 crc kubenswrapper[4971]: E0309 09:23:39.460600 4971 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.460680 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.464591 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.465252 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.465295 4971 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dd82286e7ee5adff627366157b3654b38161971589300a0eaaa567b811af65e2" exitCode=1 Mar 09 09:23:39 crc kubenswrapper[4971]: 
I0309 09:23:39.465326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dd82286e7ee5adff627366157b3654b38161971589300a0eaaa567b811af65e2"} Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.465797 4971 scope.go:117] "RemoveContainer" containerID="dd82286e7ee5adff627366157b3654b38161971589300a0eaaa567b811af65e2" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.466056 4971 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.468516 4971 status_manager.go:851] "Failed to get status for pod" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:39 crc kubenswrapper[4971]: I0309 09:23:39.468908 4971 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.472238 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 
09:23:40.472981 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.473039 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"af445e0a8b99a25e6231fa6a62c1c340d0342769818156004c8da94bdfc7c326"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477644 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b193a1a971f76b1d847cd52d8102eb92ce9514bc09244c2a5fbe8c6d23b5d632"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477684 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fbb7f90e9066100765ff008bf033dd38f57a7a29850e5db63d2d64ec9bede2f5"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b03aae9cb9613e1c753696fe0ae1334ed51d7828f899b9b0066998bfc5c28460"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477704 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5470cea106f70fb6d8cb8b01e34d2495c1277bb515f07a42f6488fa00afeb0c"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f323f72f63c85a5c7bb011d42024adecc5eb5ec779c7937c495d3d2a8622711d"} Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477931 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.477944 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:40 crc kubenswrapper[4971]: I0309 09:23:40.478232 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:41 crc kubenswrapper[4971]: I0309 09:23:41.839983 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" containerID="cri-o://9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35" gracePeriod=15 Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.245676 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279490 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7tf\" (UniqueName: \"kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279522 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279540 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279567 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc 
kubenswrapper[4971]: I0309 09:23:42.279582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279598 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279626 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279647 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279687 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.279708 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280064 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280332 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280458 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280506 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280559 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template\") pod \"2555712b-fa0a-4831-90ca-78d22b2e48b9\" (UID: \"2555712b-fa0a-4831-90ca-78d22b2e48b9\") " Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280763 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280794 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280814 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280827 4971 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.280836 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.285705 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.286011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.286280 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.286659 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.286833 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.286974 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.287532 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.287619 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.298845 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf" (OuterVolumeSpecName: "kube-api-access-jk7tf") pod "2555712b-fa0a-4831-90ca-78d22b2e48b9" (UID: "2555712b-fa0a-4831-90ca-78d22b2e48b9"). InnerVolumeSpecName "kube-api-access-jk7tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382529 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382577 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382589 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382602 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382613 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382622 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382632 4971 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-jk7tf\" (UniqueName: \"kubernetes.io/projected/2555712b-fa0a-4831-90ca-78d22b2e48b9-kube-api-access-jk7tf\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382641 4971 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2555712b-fa0a-4831-90ca-78d22b2e48b9-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382650 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382658 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.382668 4971 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2555712b-fa0a-4831-90ca-78d22b2e48b9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.492587 4971 generic.go:334] "Generic (PLEG): container finished" podID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerID="9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35" exitCode=0 Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.492670 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" event={"ID":"2555712b-fa0a-4831-90ca-78d22b2e48b9","Type":"ContainerDied","Data":"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35"} Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.492704 4971 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.493098 4971 scope.go:117] "RemoveContainer" containerID="9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.493070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" event={"ID":"2555712b-fa0a-4831-90ca-78d22b2e48b9","Type":"ContainerDied","Data":"8043e5c566deff82a35ba7b2829ca22de3a082b596fc22b1a6d56c9fb24594b5"} Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.520490 4971 scope.go:117] "RemoveContainer" containerID="9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35" Mar 09 09:23:42 crc kubenswrapper[4971]: E0309 09:23:42.521182 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35\": container with ID starting with 9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35 not found: ID does not exist" containerID="9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35" Mar 09 09:23:42 crc kubenswrapper[4971]: I0309 09:23:42.521244 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35"} err="failed to get container status \"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35\": rpc error: code = NotFound desc = could not find container \"9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35\": container with ID starting with 9dd9162e41be5ca1e990c283b045bc207dc6f4526631e7d93083a2d856a09d35 not found: ID does not exist" Mar 09 09:23:43 crc kubenswrapper[4971]: I0309 09:23:43.038601 4971 patch_prober.go:28] interesting 
pod/oauth-openshift-558db77b4-nvzgg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:23:43 crc kubenswrapper[4971]: I0309 09:23:43.038665 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nvzgg" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:23:43 crc kubenswrapper[4971]: I0309 09:23:43.176001 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:43 crc kubenswrapper[4971]: I0309 09:23:43.176678 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:43 crc kubenswrapper[4971]: I0309 09:23:43.178445 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:46 crc kubenswrapper[4971]: I0309 09:23:46.073813 4971 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:46 crc kubenswrapper[4971]: I0309 09:23:46.520926 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:46 crc kubenswrapper[4971]: I0309 09:23:46.520968 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:46 crc kubenswrapper[4971]: I0309 09:23:46.528545 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:23:46 crc kubenswrapper[4971]: E0309 09:23:46.648717 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Mar 09 09:23:46 crc kubenswrapper[4971]: I0309 09:23:46.935192 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:23:47 crc kubenswrapper[4971]: E0309 09:23:47.003166 4971 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Mar 09 09:23:47 crc kubenswrapper[4971]: I0309 09:23:47.191891 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2196bbf0-0aea-4baf-aca2-6f9ce20c8bb4" Mar 09 09:23:47 crc kubenswrapper[4971]: I0309 09:23:47.526707 4971 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:47 crc kubenswrapper[4971]: I0309 09:23:47.526739 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0fb4d16-1491-462c-aee2-58e5784eeee9" Mar 09 09:23:47 crc kubenswrapper[4971]: I0309 09:23:47.529951 4971 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2196bbf0-0aea-4baf-aca2-6f9ce20c8bb4" Mar 09 09:23:48 crc kubenswrapper[4971]: I0309 09:23:48.859441 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 09 09:23:48 crc kubenswrapper[4971]: I0309 09:23:48.859630 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 09 09:23:48 crc kubenswrapper[4971]: I0309 09:23:48.860091 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 09 09:23:56 crc kubenswrapper[4971]: I0309 09:23:56.996714 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 09:23:57 crc kubenswrapper[4971]: I0309 09:23:57.092919 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 09:23:57 crc kubenswrapper[4971]: I0309 09:23:57.439817 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.200860 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.337316 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.399937 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.412877 4971 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.663629 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.843331 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.860503 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.860590 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.933758 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 09:23:58 crc kubenswrapper[4971]: I0309 09:23:58.956004 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.027002 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.064291 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.089714 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.278902 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.515783 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.525200 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.562788 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.580096 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.767000 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.869620 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.874523 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.889104 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 09 09:23:59 crc kubenswrapper[4971]: I0309 09:23:59.912637 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.068908 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.165238 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.200672 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.284429 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.400246 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.556504 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.671517 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.704945 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.708854 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.927603 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.939317 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.956552 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 09:24:00 crc kubenswrapper[4971]: I0309 09:24:00.958555 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.113900 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.278143 4971 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.366850 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.396261 4971 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.436953 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.490071 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.503999 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.526237 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.553885 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.609872 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.625208 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.650568 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.833429 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 09:24:01 crc kubenswrapper[4971]: I0309 09:24:01.947593 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.000891 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.173922 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.223803 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.277270 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.343400 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.355426 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.365263 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.544550 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.544592 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.554569 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.569960 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.588734 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.626862 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.641099 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.701487 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.940674 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 09:24:02 crc kubenswrapper[4971]: I0309 09:24:02.988277 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.019154 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.070283 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.108537 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.188320 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.260832 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.310058 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.413180 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.426314 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 crc kubenswrapper[4971]: I0309 09:24:03.440564 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.476083 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.486848 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.494480 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.614704 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.635656 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.649791 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.662988 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.696134 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.721217 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.825242 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.835701 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.895817 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 09:24:03 crc kubenswrapper[4971]: I0309 09:24:03.991092 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.003738 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.012016 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.041392 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.153428 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.194502 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.208319 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.326514 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.362110 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.534021 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.558230 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.560377 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.622530 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.627919 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.827708 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.845743 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.867844 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.877464 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 09:24:04 crc kubenswrapper[4971]: I0309 09:24:04.973416 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.037905 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.121673 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.207890 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.258717 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.278686 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.391646 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.420647 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.498868 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.505611 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.564719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.568922 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.636773 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.695930 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.736507 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.781067 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.804970 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.843005 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.884220 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.993456 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 09 09:24:05 crc kubenswrapper[4971]: I0309 09:24:05.995995 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.002329 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.060693 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.113691 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.146831 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.220034 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.240988 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.367154 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.500901 4971 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.505715 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.50569695 podStartE2EDuration="41.50569695s" podCreationTimestamp="2026-03-09 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:46.124119821 +0000 UTC m=+229.684047631" watchObservedRunningTime="2026-03-09 09:24:06.50569695 +0000 UTC m=+250.065624770"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.507279 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nvzgg","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.507339 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.512287 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.529139 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.529123372 podStartE2EDuration="20.529123372s" podCreationTimestamp="2026-03-09 09:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:06.52576778 +0000 UTC m=+250.085695620" watchObservedRunningTime="2026-03-09 09:24:06.529123372 +0000 UTC m=+250.089051182"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.592620 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.652670 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.785970 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.787541 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.861886 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.910121 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.951304 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 09:24:06 crc kubenswrapper[4971]: I0309 09:24:06.970722 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.002770 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.089057 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.102183 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.133386 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.134521 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.149248 4971 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.159514 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" path="/var/lib/kubelet/pods/2555712b-fa0a-4831-90ca-78d22b2e48b9/volumes"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.165336 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.217887 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.230625 4971 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.330801 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.387986 4971 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.409582 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.435092 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.440448 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.506872 4971 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.507081 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815" gracePeriod=5
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.584850 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.584854 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.590694 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.596348 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.618807 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.717269 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.724008 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.778336 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.792973 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.859648 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.901030 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.901936 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:24:07 crc kubenswrapper[4971]: I0309 09:24:07.983135 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.055716 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.104713 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.106913 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.196999 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.232538 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.299944 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.348287 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.388779 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.420327 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.478225 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.529533 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.545341 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.556188 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.589433 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.602564 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.630141 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.665638 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.668117 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.698623 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.781507 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.849072 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.859896 4971 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.859964 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.860028 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.861064 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"af445e0a8b99a25e6231fa6a62c1c340d0342769818156004c8da94bdfc7c326"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.861209 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://af445e0a8b99a25e6231fa6a62c1c340d0342769818156004c8da94bdfc7c326" gracePeriod=30
Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.887532
4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.899611 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 09:24:08 crc kubenswrapper[4971]: I0309 09:24:08.967783 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.064386 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.143975 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.191791 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.196778 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.296606 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.333016 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.333925 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.355652 4971 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.389859 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.413979 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.496402 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"] Mar 09 09:24:09 crc kubenswrapper[4971]: E0309 09:24:09.496653 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" containerName="installer" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.496668 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" containerName="installer" Mar 09 09:24:09 crc kubenswrapper[4971]: E0309 09:24:09.499951 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.499979 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:24:09 crc kubenswrapper[4971]: E0309 09:24:09.499998 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.500010 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.500247 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5d26f5-5e17-4dd7-a334-5060a68b2d08" containerName="installer" Mar 09 09:24:09 crc 
kubenswrapper[4971]: I0309 09:24:09.501714 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.501750 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2555712b-fa0a-4831-90ca-78d22b2e48b9" containerName="oauth-openshift" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.502337 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516017 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"] Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516102 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516150 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516262 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516314 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516334 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.516913 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.517039 4971 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.517325 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.517618 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.517684 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.518675 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.519497 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534332 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534415 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-login\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc 
kubenswrapper[4971]: I0309 09:24:09.534643 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534671 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkt2\" (UniqueName: \"kubernetes.io/projected/24b851b8-0e94-4ba2-8272-15f04fb36e4d-kube-api-access-cgkt2\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534701 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-session\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534759 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-policies\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534784 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534810 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534836 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534856 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-error\") pod 
\"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534879 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-dir\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.534933 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.548938 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.556418 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.568306 4971 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.586554 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.596526 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.608087 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.635987 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636034 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-login\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636067 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" 
Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkt2\" (UniqueName: \"kubernetes.io/projected/24b851b8-0e94-4ba2-8272-15f04fb36e4d-kube-api-access-cgkt2\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636102 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636126 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-session\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636147 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-policies\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636166 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636189 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636208 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636223 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-error\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636240 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: 
\"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636264 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-dir\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.636281 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.637131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-dir\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.637854 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-audit-policies\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.638319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.638749 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.638953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.641465 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.641523 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-error\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " 
pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.641892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-session\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.643067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-login\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.643242 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.643966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.644835 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.650514 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24b851b8-0e94-4ba2-8272-15f04fb36e4d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.654904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkt2\" (UniqueName: \"kubernetes.io/projected/24b851b8-0e94-4ba2-8272-15f04fb36e4d-kube-api-access-cgkt2\") pod \"oauth-openshift-5f6f899c9c-cccsj\" (UID: \"24b851b8-0e94-4ba2-8272-15f04fb36e4d\") " pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.656215 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.688399 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.808881 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.846940 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"
Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.902321 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.908195 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 09:24:09 crc kubenswrapper[4971]: I0309 09:24:09.918055 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.051982 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.154524 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.207328 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.470757 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.666964 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.771425 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 09:24:10 crc kubenswrapper[4971]: I0309 09:24:10.881256 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.108202 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.193243 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.351157 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.765349 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.889619 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.895715 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.928165 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 09 09:24:11 crc kubenswrapper[4971]: I0309 09:24:11.963642 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.151557 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"]
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.188071 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.323821 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.621711 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.621770 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.666527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" event={"ID":"24b851b8-0e94-4ba2-8272-15f04fb36e4d","Type":"ContainerStarted","Data":"b3bf3ed4e03060c0ed4d87512c05f3c0e2b890c2e4c7a997989c45fe6dfc39d7"}
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.666597 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" event={"ID":"24b851b8-0e94-4ba2-8272-15f04fb36e4d","Type":"ContainerStarted","Data":"c6efa8f945ca6825a69983bfa9c7c27a47d2ee6ff9d9098e719f0a3403a7583a"}
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.666620 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.668570 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.668617 4971 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815" exitCode=137
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.668687 4971 scope.go:117] "RemoveContainer" containerID="8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.668819 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.684604 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj" podStartSLOduration=56.684586691 podStartE2EDuration="56.684586691s" podCreationTimestamp="2026-03-09 09:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:12.684260641 +0000 UTC m=+256.244188461" watchObservedRunningTime="2026-03-09 09:24:12.684586691 +0000 UTC m=+256.244514501"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686284 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686331 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686401 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686423 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686442 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686472 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686739 4971 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.686979 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.687026 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.687396 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.691781 4971 scope.go:117] "RemoveContainer" containerID="8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815"
Mar 09 09:24:12 crc kubenswrapper[4971]: E0309 09:24:12.692316 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815\": container with ID starting with 8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815 not found: ID does not exist" containerID="8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.692369 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815"} err="failed to get container status \"8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815\": rpc error: code = NotFound desc = could not find container \"8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815\": container with ID starting with 8391fa528a05b270a93c269c629e988d0f5095321719186bfba5ae669bc74815 not found: ID does not exist"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.694318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.743551 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f6f899c9c-cccsj"
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.787966 4971 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.788014 4971 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.788057 4971 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:12 crc kubenswrapper[4971]: I0309 09:24:12.788076 4971 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.160625 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.162257 4971 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.173832 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.173868 4971 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="20a24b46-b8e5-4c4b-bb20-52219a96fad4"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.179743 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.179931 4971 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="20a24b46-b8e5-4c4b-bb20-52219a96fad4"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.334295 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.404853 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.411648 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 09:24:13 crc kubenswrapper[4971]: I0309 09:24:13.516712 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 09 09:24:14 crc kubenswrapper[4971]: I0309 09:24:14.019809 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 09:24:14 crc kubenswrapper[4971]: I0309 09:24:14.794926 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:24:14 crc kubenswrapper[4971]: I0309 09:24:14.794987 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:24:15 crc kubenswrapper[4971]: I0309 09:24:15.504450 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.804563 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"]
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.805837 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerName="route-controller-manager" containerID="cri-o://3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d" gracePeriod=30
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.807575 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"]
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.807796 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" podUID="3f4aa567-6a43-43fb-860b-e012a2eb2878" containerName="controller-manager" containerID="cri-o://2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289" gracePeriod=30
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.954097 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550804-v5hhp"]
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.955145 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-v5hhp"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.956935 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.956961 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.957074 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.967254 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-v5hhp"]
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.990459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2ccn\" (UniqueName: \"kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn\") pod \"auto-csr-approver-29550804-v5hhp\" (UID: \"d3170fbc-6018-4b70-9f33-54a2e285fcd3\") " pod="openshift-infra/auto-csr-approver-29550804-v5hhp"
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.997514 4971 patch_prober.go:28] interesting pod/route-controller-manager-7449c948f6-fvmkg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body=
Mar 09 09:24:33 crc kubenswrapper[4971]: I0309 09:24:33.997565 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.091532 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2ccn\" (UniqueName: \"kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn\") pod \"auto-csr-approver-29550804-v5hhp\" (UID: \"d3170fbc-6018-4b70-9f33-54a2e285fcd3\") " pod="openshift-infra/auto-csr-approver-29550804-v5hhp"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.122695 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2ccn\" (UniqueName: \"kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn\") pod \"auto-csr-approver-29550804-v5hhp\" (UID: \"d3170fbc-6018-4b70-9f33-54a2e285fcd3\") " pod="openshift-infra/auto-csr-approver-29550804-v5hhp"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.331148 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.385225 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.395566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvjfw\" (UniqueName: \"kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw\") pod \"733bd451-d1e8-48db-a9fb-87a12b247c39\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.395619 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config\") pod \"733bd451-d1e8-48db-a9fb-87a12b247c39\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.395738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert\") pod \"733bd451-d1e8-48db-a9fb-87a12b247c39\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.395773 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca\") pod \"733bd451-d1e8-48db-a9fb-87a12b247c39\" (UID: \"733bd451-d1e8-48db-a9fb-87a12b247c39\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.396339 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config" (OuterVolumeSpecName: "config") pod "733bd451-d1e8-48db-a9fb-87a12b247c39" (UID: "733bd451-d1e8-48db-a9fb-87a12b247c39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.396369 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca" (OuterVolumeSpecName: "client-ca") pod "733bd451-d1e8-48db-a9fb-87a12b247c39" (UID: "733bd451-d1e8-48db-a9fb-87a12b247c39"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.398816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw" (OuterVolumeSpecName: "kube-api-access-lvjfw") pod "733bd451-d1e8-48db-a9fb-87a12b247c39" (UID: "733bd451-d1e8-48db-a9fb-87a12b247c39"). InnerVolumeSpecName "kube-api-access-lvjfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.399187 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "733bd451-d1e8-48db-a9fb-87a12b247c39" (UID: "733bd451-d1e8-48db-a9fb-87a12b247c39"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.414137 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-v5hhp"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.496847 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles\") pod \"3f4aa567-6a43-43fb-860b-e012a2eb2878\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.496962 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert\") pod \"3f4aa567-6a43-43fb-860b-e012a2eb2878\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.496992 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config\") pod \"3f4aa567-6a43-43fb-860b-e012a2eb2878\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497038 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca\") pod \"3f4aa567-6a43-43fb-860b-e012a2eb2878\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497076 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg7zb\" (UniqueName: \"kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb\") pod \"3f4aa567-6a43-43fb-860b-e012a2eb2878\" (UID: \"3f4aa567-6a43-43fb-860b-e012a2eb2878\") "
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497299 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733bd451-d1e8-48db-a9fb-87a12b247c39-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497311 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497322 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvjfw\" (UniqueName: \"kubernetes.io/projected/733bd451-d1e8-48db-a9fb-87a12b247c39-kube-api-access-lvjfw\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.497332 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733bd451-d1e8-48db-a9fb-87a12b247c39-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.501219 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb" (OuterVolumeSpecName: "kube-api-access-qg7zb") pod "3f4aa567-6a43-43fb-860b-e012a2eb2878" (UID: "3f4aa567-6a43-43fb-860b-e012a2eb2878"). InnerVolumeSpecName "kube-api-access-qg7zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.503895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config" (OuterVolumeSpecName: "config") pod "3f4aa567-6a43-43fb-860b-e012a2eb2878" (UID: "3f4aa567-6a43-43fb-860b-e012a2eb2878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.504420 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3f4aa567-6a43-43fb-860b-e012a2eb2878" (UID: "3f4aa567-6a43-43fb-860b-e012a2eb2878"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.504817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f4aa567-6a43-43fb-860b-e012a2eb2878" (UID: "3f4aa567-6a43-43fb-860b-e012a2eb2878"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.506578 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f4aa567-6a43-43fb-860b-e012a2eb2878" (UID: "3f4aa567-6a43-43fb-860b-e012a2eb2878"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.598367 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4aa567-6a43-43fb-860b-e012a2eb2878-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.598459 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.598479 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.598521 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg7zb\" (UniqueName: \"kubernetes.io/projected/3f4aa567-6a43-43fb-860b-e012a2eb2878-kube-api-access-qg7zb\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.598541 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f4aa567-6a43-43fb-860b-e012a2eb2878-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.616033 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-v5hhp"]
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.822918 4971 generic.go:334] "Generic (PLEG): container finished" podID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerID="3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d" exitCode=0
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.823014 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" event={"ID":"733bd451-d1e8-48db-a9fb-87a12b247c39","Type":"ContainerDied","Data":"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"}
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.823041 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg" event={"ID":"733bd451-d1e8-48db-a9fb-87a12b247c39","Type":"ContainerDied","Data":"61d9edc1658341fe2cb9fe37f92690ea33d5d0e2670b094f01a048a233d87413"}
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.823077 4971 scope.go:117] "RemoveContainer" containerID="3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.823103 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.824653 4971 generic.go:334] "Generic (PLEG): container finished" podID="3f4aa567-6a43-43fb-860b-e012a2eb2878" containerID="2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289" exitCode=0
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.824794 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.824828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" event={"ID":"3f4aa567-6a43-43fb-860b-e012a2eb2878","Type":"ContainerDied","Data":"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"}
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.824848 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb" event={"ID":"3f4aa567-6a43-43fb-860b-e012a2eb2878","Type":"ContainerDied","Data":"0dcb7e2a1bf67b029b73157407b99f1beeb42683f3bf11888e8590386114183b"}
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.826581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" event={"ID":"d3170fbc-6018-4b70-9f33-54a2e285fcd3","Type":"ContainerStarted","Data":"b51633c246076dec418ed39afcaa5dabb7068e5eb18b5a4ad6de7781f178f936"}
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.865606 4971 scope.go:117] "RemoveContainer" containerID="3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"
Mar 09 09:24:34 crc kubenswrapper[4971]: E0309 09:24:34.866164 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d\": container with ID starting with 3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d not found: ID does not exist" containerID="3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.866192 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d"} err="failed to get container status \"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d\": rpc error: code = NotFound desc = could not find container \"3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d\": container with ID starting with 3f3ec83adcf3961a42e00e506fa209e9994eabd95f3078ff522c90255a87cc3d not found: ID does not exist"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.866212 4971 scope.go:117] "RemoveContainer" containerID="2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.868289 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"]
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.880863 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6ff784b8b5-kgfgb"]
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.886235 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"]
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.892273 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7449c948f6-fvmkg"]
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.899818 4971 scope.go:117] "RemoveContainer" containerID="2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"
Mar 09 09:24:34 crc kubenswrapper[4971]: E0309 09:24:34.900252 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289\": container with ID starting with 2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289 not found: ID does not exist" containerID="2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"
Mar 09 09:24:34 crc kubenswrapper[4971]: I0309 09:24:34.900307 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289"} err="failed to get container status \"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289\": rpc error: code = NotFound desc = could not find container \"2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289\": container with ID starting with 2e60fda8c9dd00e86bf523d0436a0777c051bbc014b328a107a7c348fd61a289 not found: ID does not exist"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.157920 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4aa567-6a43-43fb-860b-e012a2eb2878" path="/var/lib/kubelet/pods/3f4aa567-6a43-43fb-860b-e012a2eb2878/volumes"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.158769 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" path="/var/lib/kubelet/pods/733bd451-d1e8-48db-a9fb-87a12b247c39/volumes"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.507246 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"]
Mar 09 09:24:35 crc kubenswrapper[4971]: E0309 09:24:35.507570 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4aa567-6a43-43fb-860b-e012a2eb2878" containerName="controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.507587 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4aa567-6a43-43fb-860b-e012a2eb2878" containerName="controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: E0309 09:24:35.507597 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerName="route-controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.507606 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerName="route-controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.507707 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="733bd451-d1e8-48db-a9fb-87a12b247c39" containerName="route-controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.507722 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4aa567-6a43-43fb-860b-e012a2eb2878" containerName="controller-manager"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.508157 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.510823 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.514179 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"]
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.516289 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.517264 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.517859 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.518450 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.519282 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.519318 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.521148 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.528895 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"] Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.538178 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.538491 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.538728 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 
09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.538729 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.539316 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.546157 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.549279 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"] Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628122 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628195 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628232 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " 
pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628279 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628335 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628454 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7db5s\" (UniqueName: \"kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628492 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb8q9\" (UniqueName: \"kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.628537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.729914 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730002 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730030 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7db5s\" (UniqueName: \"kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") 
" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730059 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730083 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb8q9\" (UniqueName: \"kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730116 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730142 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730171 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.730198 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.731897 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.732892 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.733871 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.737001 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.737843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.737851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.738280 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.748962 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb8q9\" (UniqueName: \"kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9\") pod \"controller-manager-f8d9f876c-gcxkz\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" 
Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.753445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7db5s\" (UniqueName: \"kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s\") pod \"route-controller-manager-7bf9495b6c-hdcgx\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.832470 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" event={"ID":"d3170fbc-6018-4b70-9f33-54a2e285fcd3","Type":"ContainerStarted","Data":"eadf83ab2987345d8537254ea5c39ea61b842d6ca15febd7c3dcc9ccd44446db"} Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.834563 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.836999 4971 generic.go:334] "Generic (PLEG): container finished" podID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerID="ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0" exitCode=0 Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.837048 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerDied","Data":"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0"} Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.837406 4971 scope.go:117] "RemoveContainer" containerID="ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.847079 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" podStartSLOduration=1.9638606570000001 
podStartE2EDuration="2.847059774s" podCreationTimestamp="2026-03-09 09:24:33 +0000 UTC" firstStartedPulling="2026-03-09 09:24:34.622312118 +0000 UTC m=+278.182239948" lastFinishedPulling="2026-03-09 09:24:35.505511255 +0000 UTC m=+279.065439065" observedRunningTime="2026-03-09 09:24:35.845253683 +0000 UTC m=+279.405181513" watchObservedRunningTime="2026-03-09 09:24:35.847059774 +0000 UTC m=+279.406987594" Mar 09 09:24:35 crc kubenswrapper[4971]: I0309 09:24:35.875561 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.057270 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"] Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.111456 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"] Mar 09 09:24:36 crc kubenswrapper[4971]: W0309 09:24:36.118202 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538b6646_1cc1_41f8_aebf_c088acdfdbdf.slice/crio-ab209aa9ef900c25df53273a12bc8b146e6fb34691c9a9f13049b88a38f6707b WatchSource:0}: Error finding container ab209aa9ef900c25df53273a12bc8b146e6fb34691c9a9f13049b88a38f6707b: Status 404 returned error can't find the container with id ab209aa9ef900c25df53273a12bc8b146e6fb34691c9a9f13049b88a38f6707b Mar 09 09:24:36 crc kubenswrapper[4971]: W0309 09:24:36.120779 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf500d6c0_ede1_4254_b268_652efd3b1d80.slice/crio-f45e3f7b5159f4bce9dfdef5e58ff584e8fc0a25076f432769838797d176ab9a WatchSource:0}: Error finding container f45e3f7b5159f4bce9dfdef5e58ff584e8fc0a25076f432769838797d176ab9a: Status 404 returned error can't find 
the container with id f45e3f7b5159f4bce9dfdef5e58ff584e8fc0a25076f432769838797d176ab9a Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.844307 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" event={"ID":"f500d6c0-ede1-4254-b268-652efd3b1d80","Type":"ContainerStarted","Data":"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.844411 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" event={"ID":"f500d6c0-ede1-4254-b268-652efd3b1d80","Type":"ContainerStarted","Data":"f45e3f7b5159f4bce9dfdef5e58ff584e8fc0a25076f432769838797d176ab9a"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.847014 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.848793 4971 generic.go:334] "Generic (PLEG): container finished" podID="d3170fbc-6018-4b70-9f33-54a2e285fcd3" containerID="eadf83ab2987345d8537254ea5c39ea61b842d6ca15febd7c3dcc9ccd44446db" exitCode=0 Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.848875 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" event={"ID":"d3170fbc-6018-4b70-9f33-54a2e285fcd3","Type":"ContainerDied","Data":"eadf83ab2987345d8537254ea5c39ea61b842d6ca15febd7c3dcc9ccd44446db"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.850516 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" event={"ID":"538b6646-1cc1-41f8-aebf-c088acdfdbdf","Type":"ContainerStarted","Data":"20cf5045f8c0e01d3bc3d361004945834aaa5250208716649a9f20509097eccf"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.850556 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" event={"ID":"538b6646-1cc1-41f8-aebf-c088acdfdbdf","Type":"ContainerStarted","Data":"ab209aa9ef900c25df53273a12bc8b146e6fb34691c9a9f13049b88a38f6707b"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.850757 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.852111 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerStarted","Data":"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2"} Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.852374 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.853736 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.854441 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.859211 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.867268 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" podStartSLOduration=3.867250284 podStartE2EDuration="3.867250284s" podCreationTimestamp="2026-03-09 09:24:33 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:36.86466555 +0000 UTC m=+280.424593350" watchObservedRunningTime="2026-03-09 09:24:36.867250284 +0000 UTC m=+280.427178104" Mar 09 09:24:36 crc kubenswrapper[4971]: I0309 09:24:36.950300 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" podStartSLOduration=3.950283651 podStartE2EDuration="3.950283651s" podCreationTimestamp="2026-03-09 09:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:36.946012338 +0000 UTC m=+280.505940148" watchObservedRunningTime="2026-03-09 09:24:36.950283651 +0000 UTC m=+280.510211461" Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.095616 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.120155 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2ccn\" (UniqueName: \"kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn\") pod \"d3170fbc-6018-4b70-9f33-54a2e285fcd3\" (UID: \"d3170fbc-6018-4b70-9f33-54a2e285fcd3\") " Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.133575 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn" (OuterVolumeSpecName: "kube-api-access-r2ccn") pod "d3170fbc-6018-4b70-9f33-54a2e285fcd3" (UID: "d3170fbc-6018-4b70-9f33-54a2e285fcd3"). InnerVolumeSpecName "kube-api-access-r2ccn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.221339 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2ccn\" (UniqueName: \"kubernetes.io/projected/d3170fbc-6018-4b70-9f33-54a2e285fcd3-kube-api-access-r2ccn\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.863894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" event={"ID":"d3170fbc-6018-4b70-9f33-54a2e285fcd3","Type":"ContainerDied","Data":"b51633c246076dec418ed39afcaa5dabb7068e5eb18b5a4ad6de7781f178f936"} Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.863966 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b51633c246076dec418ed39afcaa5dabb7068e5eb18b5a4ad6de7781f178f936" Mar 09 09:24:38 crc kubenswrapper[4971]: I0309 09:24:38.864054 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-v5hhp" Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.872207 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874041 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874598 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874646 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="af445e0a8b99a25e6231fa6a62c1c340d0342769818156004c8da94bdfc7c326" exitCode=137 Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"af445e0a8b99a25e6231fa6a62c1c340d0342769818156004c8da94bdfc7c326"} Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7db9df3401bc989de35204fc112e7afc1c1f63f531bda92e96dd8e585c7b8bb4"} Mar 09 09:24:39 crc kubenswrapper[4971]: I0309 09:24:39.874728 4971 scope.go:117] "RemoveContainer" containerID="dd82286e7ee5adff627366157b3654b38161971589300a0eaaa567b811af65e2" Mar 09 09:24:40 crc kubenswrapper[4971]: I0309 09:24:40.884577 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 09:24:40 crc kubenswrapper[4971]: I0309 09:24:40.887158 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:24:44 crc kubenswrapper[4971]: I0309 09:24:44.794781 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:24:44 crc kubenswrapper[4971]: I0309 09:24:44.795835 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:24:46 crc kubenswrapper[4971]: I0309 09:24:46.935890 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:24:48 crc kubenswrapper[4971]: I0309 09:24:48.859564 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:24:48 crc kubenswrapper[4971]: I0309 09:24:48.863951 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:24:56 crc kubenswrapper[4971]: I0309 09:24:56.941698 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.069706 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"] Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.070272 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" podUID="f500d6c0-ede1-4254-b268-652efd3b1d80" containerName="controller-manager" containerID="cri-o://457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912" gracePeriod=30 Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.558834 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.684391 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config\") pod \"f500d6c0-ede1-4254-b268-652efd3b1d80\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca\") pod \"f500d6c0-ede1-4254-b268-652efd3b1d80\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685242 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert\") pod \"f500d6c0-ede1-4254-b268-652efd3b1d80\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles\") pod \"f500d6c0-ede1-4254-b268-652efd3b1d80\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685301 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config" (OuterVolumeSpecName: "config") pod "f500d6c0-ede1-4254-b268-652efd3b1d80" (UID: "f500d6c0-ede1-4254-b268-652efd3b1d80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685323 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb8q9\" (UniqueName: \"kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9\") pod \"f500d6c0-ede1-4254-b268-652efd3b1d80\" (UID: \"f500d6c0-ede1-4254-b268-652efd3b1d80\") " Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.685495 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.686200 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca" (OuterVolumeSpecName: "client-ca") pod "f500d6c0-ede1-4254-b268-652efd3b1d80" (UID: "f500d6c0-ede1-4254-b268-652efd3b1d80"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.686458 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f500d6c0-ede1-4254-b268-652efd3b1d80" (UID: "f500d6c0-ede1-4254-b268-652efd3b1d80"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.692857 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f500d6c0-ede1-4254-b268-652efd3b1d80" (UID: "f500d6c0-ede1-4254-b268-652efd3b1d80"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.692925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9" (OuterVolumeSpecName: "kube-api-access-wb8q9") pod "f500d6c0-ede1-4254-b268-652efd3b1d80" (UID: "f500d6c0-ede1-4254-b268-652efd3b1d80"). InnerVolumeSpecName "kube-api-access-wb8q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.786868 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.786916 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500d6c0-ede1-4254-b268-652efd3b1d80-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.786931 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f500d6c0-ede1-4254-b268-652efd3b1d80-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.786945 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb8q9\" (UniqueName: \"kubernetes.io/projected/f500d6c0-ede1-4254-b268-652efd3b1d80-kube-api-access-wb8q9\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.984502 4971 generic.go:334] "Generic (PLEG): container finished" podID="f500d6c0-ede1-4254-b268-652efd3b1d80" containerID="457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912" exitCode=0 Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.984545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" event={"ID":"f500d6c0-ede1-4254-b268-652efd3b1d80","Type":"ContainerDied","Data":"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912"} Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.984573 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" event={"ID":"f500d6c0-ede1-4254-b268-652efd3b1d80","Type":"ContainerDied","Data":"f45e3f7b5159f4bce9dfdef5e58ff584e8fc0a25076f432769838797d176ab9a"} Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.984591 4971 scope.go:117] "RemoveContainer" containerID="457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912" Mar 09 09:24:58 crc kubenswrapper[4971]: I0309 09:24:58.984668 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-gcxkz" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.002564 4971 scope.go:117] "RemoveContainer" containerID="457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912" Mar 09 09:24:59 crc kubenswrapper[4971]: E0309 09:24:59.003219 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912\": container with ID starting with 457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912 not found: ID does not exist" containerID="457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.003283 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912"} err="failed to get container status \"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912\": rpc error: code = NotFound desc = could not find container 
\"457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912\": container with ID starting with 457e598af0ea390d908a69275531c987be62547bf87c8d2e577ef18c29a88912 not found: ID does not exist" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.021567 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"] Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.027859 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-gcxkz"] Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.158874 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f500d6c0-ede1-4254-b268-652efd3b1d80" path="/var/lib/kubelet/pods/f500d6c0-ede1-4254-b268-652efd3b1d80/volumes" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.525794 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"] Mar 09 09:24:59 crc kubenswrapper[4971]: E0309 09:24:59.526052 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f500d6c0-ede1-4254-b268-652efd3b1d80" containerName="controller-manager" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.526067 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f500d6c0-ede1-4254-b268-652efd3b1d80" containerName="controller-manager" Mar 09 09:24:59 crc kubenswrapper[4971]: E0309 09:24:59.526080 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3170fbc-6018-4b70-9f33-54a2e285fcd3" containerName="oc" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.526088 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3170fbc-6018-4b70-9f33-54a2e285fcd3" containerName="oc" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.526218 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3170fbc-6018-4b70-9f33-54a2e285fcd3" containerName="oc" Mar 09 09:24:59 crc 
kubenswrapper[4971]: I0309 09:24:59.526234 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f500d6c0-ede1-4254-b268-652efd3b1d80" containerName="controller-manager" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.526675 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.529057 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.529139 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.529326 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.531120 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.535241 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.536710 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"] Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.542797 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.549955 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.596341 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.596405 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.596434 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.596472 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lbp6\" (UniqueName: \"kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.596514 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca\") pod \"controller-manager-754b797845-qzlxj\" (UID: 
\"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.697851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.697901 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.697920 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.697951 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lbp6\" (UniqueName: \"kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.697971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.699260 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.699426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.699437 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.701818 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.716001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6lbp6\" (UniqueName: \"kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6\") pod \"controller-manager-754b797845-qzlxj\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") " pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:24:59 crc kubenswrapper[4971]: I0309 09:24:59.856059 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:25:00 crc kubenswrapper[4971]: I0309 09:25:00.275293 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"] Mar 09 09:25:01 crc kubenswrapper[4971]: I0309 09:25:01.002132 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" event={"ID":"05c69372-25be-478b-8dd5-bd1fd9a9ed49","Type":"ContainerStarted","Data":"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"} Mar 09 09:25:01 crc kubenswrapper[4971]: I0309 09:25:01.002962 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:25:01 crc kubenswrapper[4971]: I0309 09:25:01.002989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" event={"ID":"05c69372-25be-478b-8dd5-bd1fd9a9ed49","Type":"ContainerStarted","Data":"e80cd68438467db80fedeedd65c3353da5b6592391360c3284d2452409f1d673"} Mar 09 09:25:01 crc kubenswrapper[4971]: I0309 09:25:01.007074 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" Mar 09 09:25:01 crc kubenswrapper[4971]: I0309 09:25:01.018653 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" 
podStartSLOduration=3.018633785 podStartE2EDuration="3.018633785s" podCreationTimestamp="2026-03-09 09:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:01.017665467 +0000 UTC m=+304.577593277" watchObservedRunningTime="2026-03-09 09:25:01.018633785 +0000 UTC m=+304.578561595" Mar 09 09:25:14 crc kubenswrapper[4971]: I0309 09:25:14.795034 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:25:14 crc kubenswrapper[4971]: I0309 09:25:14.795779 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:25:14 crc kubenswrapper[4971]: I0309 09:25:14.795847 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:25:14 crc kubenswrapper[4971]: I0309 09:25:14.796649 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:25:14 crc kubenswrapper[4971]: I0309 09:25:14.796748 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" 
podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e" gracePeriod=600 Mar 09 09:25:15 crc kubenswrapper[4971]: I0309 09:25:15.082202 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e" exitCode=0 Mar 09 09:25:15 crc kubenswrapper[4971]: I0309 09:25:15.082568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e"} Mar 09 09:25:15 crc kubenswrapper[4971]: I0309 09:25:15.082597 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570"} Mar 09 09:25:22 crc kubenswrapper[4971]: I0309 09:25:22.862826 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jn2xq"] Mar 09 09:25:22 crc kubenswrapper[4971]: I0309 09:25:22.864269 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:22 crc kubenswrapper[4971]: I0309 09:25:22.880135 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jn2xq"] Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.012552 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-bound-sa-token\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-tls\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013030 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a0c6ad8-0175-454c-ba7e-c022d27f0078-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013067 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-certificates\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013122 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a0c6ad8-0175-454c-ba7e-c022d27f0078-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013168 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013200 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4jp\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-kube-api-access-cp4jp\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.013228 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-trusted-ca\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.038569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115037 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-tls\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115089 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a0c6ad8-0175-454c-ba7e-c022d27f0078-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115132 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-certificates\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a0c6ad8-0175-454c-ba7e-c022d27f0078-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" Mar 09 
09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4jp\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-kube-api-access-cp4jp\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115257 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-trusted-ca\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115282 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-bound-sa-token\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.115728 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a0c6ad8-0175-454c-ba7e-c022d27f0078-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.116586 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-certificates\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.117406 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a0c6ad8-0175-454c-ba7e-c022d27f0078-trusted-ca\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.122559 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a0c6ad8-0175-454c-ba7e-c022d27f0078-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.123085 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-registry-tls\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.131550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-bound-sa-token\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.133326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4jp\" (UniqueName: \"kubernetes.io/projected/4a0c6ad8-0175-454c-ba7e-c022d27f0078-kube-api-access-cp4jp\") pod \"image-registry-66df7c8f76-jn2xq\" (UID: \"4a0c6ad8-0175-454c-ba7e-c022d27f0078\") " pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.184176 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:23 crc kubenswrapper[4971]: I0309 09:25:23.593576 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jn2xq"]
Mar 09 09:25:24 crc kubenswrapper[4971]: I0309 09:25:24.134151 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" event={"ID":"4a0c6ad8-0175-454c-ba7e-c022d27f0078","Type":"ContainerStarted","Data":"b5e5f1fe5e63206180024c8b574e3df791f441fa0d01566bc2b4f00b07151202"}
Mar 09 09:25:24 crc kubenswrapper[4971]: I0309 09:25:24.134490 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" event={"ID":"4a0c6ad8-0175-454c-ba7e-c022d27f0078","Type":"ContainerStarted","Data":"663f32636afaf36abafa59bdc1ca315c2f74385684b081238043db3ff5f9e535"}
Mar 09 09:25:24 crc kubenswrapper[4971]: I0309 09:25:24.134505 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:24 crc kubenswrapper[4971]: I0309 09:25:24.157960 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq" podStartSLOduration=2.157937493 podStartE2EDuration="2.157937493s" podCreationTimestamp="2026-03-09 09:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:24.152972622 +0000 UTC m=+327.712900432" watchObservedRunningTime="2026-03-09 09:25:24.157937493 +0000 UTC m=+327.717865313"
Mar 09 09:25:31 crc kubenswrapper[4971]: I0309 09:25:31.875224 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"]
Mar 09 09:25:31 crc kubenswrapper[4971]: I0309 09:25:31.877531 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" podUID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" containerName="route-controller-manager" containerID="cri-o://20cf5045f8c0e01d3bc3d361004945834aaa5250208716649a9f20509097eccf" gracePeriod=30
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.178928 4971 generic.go:334] "Generic (PLEG): container finished" podID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" containerID="20cf5045f8c0e01d3bc3d361004945834aaa5250208716649a9f20509097eccf" exitCode=0
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.179017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" event={"ID":"538b6646-1cc1-41f8-aebf-c088acdfdbdf","Type":"ContainerDied","Data":"20cf5045f8c0e01d3bc3d361004945834aaa5250208716649a9f20509097eccf"}
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.313491 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.437643 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca\") pod \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") "
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.437746 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert\") pod \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") "
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.437847 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config\") pod \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") "
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.437894 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7db5s\" (UniqueName: \"kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s\") pod \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\" (UID: \"538b6646-1cc1-41f8-aebf-c088acdfdbdf\") "
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.438556 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca" (OuterVolumeSpecName: "client-ca") pod "538b6646-1cc1-41f8-aebf-c088acdfdbdf" (UID: "538b6646-1cc1-41f8-aebf-c088acdfdbdf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.438624 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config" (OuterVolumeSpecName: "config") pod "538b6646-1cc1-41f8-aebf-c088acdfdbdf" (UID: "538b6646-1cc1-41f8-aebf-c088acdfdbdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.443553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s" (OuterVolumeSpecName: "kube-api-access-7db5s") pod "538b6646-1cc1-41f8-aebf-c088acdfdbdf" (UID: "538b6646-1cc1-41f8-aebf-c088acdfdbdf"). InnerVolumeSpecName "kube-api-access-7db5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.444404 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "538b6646-1cc1-41f8-aebf-c088acdfdbdf" (UID: "538b6646-1cc1-41f8-aebf-c088acdfdbdf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.539811 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.539882 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/538b6646-1cc1-41f8-aebf-c088acdfdbdf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.539901 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/538b6646-1cc1-41f8-aebf-c088acdfdbdf-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:32 crc kubenswrapper[4971]: I0309 09:25:32.539920 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7db5s\" (UniqueName: \"kubernetes.io/projected/538b6646-1cc1-41f8-aebf-c088acdfdbdf-kube-api-access-7db5s\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.185821 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx" event={"ID":"538b6646-1cc1-41f8-aebf-c088acdfdbdf","Type":"ContainerDied","Data":"ab209aa9ef900c25df53273a12bc8b146e6fb34691c9a9f13049b88a38f6707b"}
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.185872 4971 scope.go:117] "RemoveContainer" containerID="20cf5045f8c0e01d3bc3d361004945834aaa5250208716649a9f20509097eccf"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.185992 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.208745 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"]
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.212594 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-hdcgx"]
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.542895 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"]
Mar 09 09:25:33 crc kubenswrapper[4971]: E0309 09:25:33.543526 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" containerName="route-controller-manager"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.543540 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" containerName="route-controller-manager"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.543672 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" containerName="route-controller-manager"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.544111 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.551312 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.551571 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.551956 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.552007 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.552165 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.552640 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.556224 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"]
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.652800 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-client-ca\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.652873 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknsm\" (UniqueName: \"kubernetes.io/projected/1001f897-e122-4955-94a8-f5f262c1aa58-kube-api-access-sknsm\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.652966 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1001f897-e122-4955-94a8-f5f262c1aa58-serving-cert\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.653020 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-config\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.753831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1001f897-e122-4955-94a8-f5f262c1aa58-serving-cert\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.753905 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-config\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.753947 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-client-ca\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.753984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sknsm\" (UniqueName: \"kubernetes.io/projected/1001f897-e122-4955-94a8-f5f262c1aa58-kube-api-access-sknsm\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.755233 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-client-ca\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.755590 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1001f897-e122-4955-94a8-f5f262c1aa58-config\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.758742 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1001f897-e122-4955-94a8-f5f262c1aa58-serving-cert\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.770861 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknsm\" (UniqueName: \"kubernetes.io/projected/1001f897-e122-4955-94a8-f5f262c1aa58-kube-api-access-sknsm\") pod \"route-controller-manager-6697f988b5-cjkpw\" (UID: \"1001f897-e122-4955-94a8-f5f262c1aa58\") " pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:33 crc kubenswrapper[4971]: I0309 09:25:33.861787 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:34 crc kubenswrapper[4971]: I0309 09:25:34.273938 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"]
Mar 09 09:25:34 crc kubenswrapper[4971]: W0309 09:25:34.280886 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1001f897_e122_4955_94a8_f5f262c1aa58.slice/crio-0b415393de4ba91ae2eabbae2671405f23455cff8b0643305632c2c2ecf575ec WatchSource:0}: Error finding container 0b415393de4ba91ae2eabbae2671405f23455cff8b0643305632c2c2ecf575ec: Status 404 returned error can't find the container with id 0b415393de4ba91ae2eabbae2671405f23455cff8b0643305632c2c2ecf575ec
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.159209 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="538b6646-1cc1-41f8-aebf-c088acdfdbdf" path="/var/lib/kubelet/pods/538b6646-1cc1-41f8-aebf-c088acdfdbdf/volumes"
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.208462 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw" event={"ID":"1001f897-e122-4955-94a8-f5f262c1aa58","Type":"ContainerStarted","Data":"fb24a19ffa05ce4e12a9716080318885716b633803a8fbb8d2ae01c1cb94fe3c"}
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.208511 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw" event={"ID":"1001f897-e122-4955-94a8-f5f262c1aa58","Type":"ContainerStarted","Data":"0b415393de4ba91ae2eabbae2671405f23455cff8b0643305632c2c2ecf575ec"}
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.209097 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.215948 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw"
Mar 09 09:25:35 crc kubenswrapper[4971]: I0309 09:25:35.233479 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6697f988b5-cjkpw" podStartSLOduration=4.233458587 podStartE2EDuration="4.233458587s" podCreationTimestamp="2026-03-09 09:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:35.229760056 +0000 UTC m=+338.789687876" watchObservedRunningTime="2026-03-09 09:25:35.233458587 +0000 UTC m=+338.793386397"
Mar 09 09:25:43 crc kubenswrapper[4971]: I0309 09:25:43.193113 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jn2xq"
Mar 09 09:25:43 crc kubenswrapper[4971]: I0309 09:25:43.250588 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"]
Mar 09 09:25:51 crc kubenswrapper[4971]: I0309 09:25:51.859018 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"]
Mar 09 09:25:51 crc kubenswrapper[4971]: I0309 09:25:51.859862 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" podUID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" containerName="controller-manager" containerID="cri-o://ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.276864 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.318269 4971 generic.go:334] "Generic (PLEG): container finished" podID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" containerID="ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6" exitCode=0
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.318314 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" event={"ID":"05c69372-25be-478b-8dd5-bd1fd9a9ed49","Type":"ContainerDied","Data":"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"}
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.318388 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj" event={"ID":"05c69372-25be-478b-8dd5-bd1fd9a9ed49","Type":"ContainerDied","Data":"e80cd68438467db80fedeedd65c3353da5b6592391360c3284d2452409f1d673"}
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.318409 4971 scope.go:117] "RemoveContainer" containerID="ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.318320 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-754b797845-qzlxj"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.334765 4971 scope.go:117] "RemoveContainer" containerID="ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"
Mar 09 09:25:52 crc kubenswrapper[4971]: E0309 09:25:52.335233 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6\": container with ID starting with ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6 not found: ID does not exist" containerID="ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.335269 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6"} err="failed to get container status \"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6\": rpc error: code = NotFound desc = could not find container \"ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6\": container with ID starting with ef4a68a61bcac8b9648c96becf72be3e81dc78f4da1b64e34469eec52d3305e6 not found: ID does not exist"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.413999 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config\") pod \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") "
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.414068 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert\") pod \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") "
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.414171 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lbp6\" (UniqueName: \"kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6\") pod \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") "
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.414249 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca\") pod \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") "
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.414303 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles\") pod \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\" (UID: \"05c69372-25be-478b-8dd5-bd1fd9a9ed49\") "
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.414875 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca" (OuterVolumeSpecName: "client-ca") pod "05c69372-25be-478b-8dd5-bd1fd9a9ed49" (UID: "05c69372-25be-478b-8dd5-bd1fd9a9ed49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.415029 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config" (OuterVolumeSpecName: "config") pod "05c69372-25be-478b-8dd5-bd1fd9a9ed49" (UID: "05c69372-25be-478b-8dd5-bd1fd9a9ed49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.415268 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "05c69372-25be-478b-8dd5-bd1fd9a9ed49" (UID: "05c69372-25be-478b-8dd5-bd1fd9a9ed49"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.419378 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05c69372-25be-478b-8dd5-bd1fd9a9ed49" (UID: "05c69372-25be-478b-8dd5-bd1fd9a9ed49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.419434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6" (OuterVolumeSpecName: "kube-api-access-6lbp6") pod "05c69372-25be-478b-8dd5-bd1fd9a9ed49" (UID: "05c69372-25be-478b-8dd5-bd1fd9a9ed49"). InnerVolumeSpecName "kube-api-access-6lbp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.515489 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lbp6\" (UniqueName: \"kubernetes.io/projected/05c69372-25be-478b-8dd5-bd1fd9a9ed49-kube-api-access-6lbp6\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.515528 4971 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.515541 4971 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.515551 4971 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05c69372-25be-478b-8dd5-bd1fd9a9ed49-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.515561 4971 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05c69372-25be-478b-8dd5-bd1fd9a9ed49-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.649457 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.653735 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-754b797845-qzlxj"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.673519 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft9v2"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.673839 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ft9v2" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="registry-server" containerID="cri-o://abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.676999 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cqt9"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.678615 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5cqt9" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="registry-server" containerID="cri-o://d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.686808 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.687043 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" containerID="cri-o://a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.697782 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.698051 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4dfvc" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="registry-server" containerID="cri-o://aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.707740 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26997"]
Mar 09 09:25:52 crc kubenswrapper[4971]: E0309 09:25:52.708033 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" containerName="controller-manager"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.708052 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" containerName="controller-manager"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.708194 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" containerName="controller-manager"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.708653 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.716604 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.716932 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k6x5k" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="registry-server" containerID="cri-o://2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd" gracePeriod=30
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.721810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26997"]
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.818456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.818515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.818630 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nc5\" (UniqueName: \"kubernetes.io/projected/3cda571e-d5b5-4436-8846-df239e1c4b79-kube-api-access-54nc5\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.919712 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.919806 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nc5\" (UniqueName: \"kubernetes.io/projected/3cda571e-d5b5-4436-8846-df239e1c4b79-kube-api-access-54nc5\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.919837 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.922109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.924208 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3cda571e-d5b5-4436-8846-df239e1c4b79-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:52 crc kubenswrapper[4971]: I0309 09:25:52.940337 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nc5\" (UniqueName: \"kubernetes.io/projected/3cda571e-d5b5-4436-8846-df239e1c4b79-kube-api-access-54nc5\") pod \"marketplace-operator-79b997595-26997\" (UID: \"3cda571e-d5b5-4436-8846-df239e1c4b79\") " pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.028605 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-26997"
Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.107988 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.158478 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c69372-25be-478b-8dd5-bd1fd9a9ed49" path="/var/lib/kubelet/pods/05c69372-25be-478b-8dd5-bd1fd9a9ed49/volumes" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.215822 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.224464 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.225454 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca\") pod \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.225530 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9wjh\" (UniqueName: \"kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh\") pod \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.225566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics\") pod \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\" (UID: \"1ed6451f-4bc6-4dcc-b84c-413dbb95114b\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.226686 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1ed6451f-4bc6-4dcc-b84c-413dbb95114b" (UID: "1ed6451f-4bc6-4dcc-b84c-413dbb95114b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.230465 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.240037 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1ed6451f-4bc6-4dcc-b84c-413dbb95114b" (UID: "1ed6451f-4bc6-4dcc-b84c-413dbb95114b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.241576 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh" (OuterVolumeSpecName: "kube-api-access-j9wjh") pod "1ed6451f-4bc6-4dcc-b84c-413dbb95114b" (UID: "1ed6451f-4bc6-4dcc-b84c-413dbb95114b"). InnerVolumeSpecName "kube-api-access-j9wjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.243339 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.325653 4971 generic.go:334] "Generic (PLEG): container finished" podID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerID="d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.325704 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerDied","Data":"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.325737 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cqt9" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.325754 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cqt9" event={"ID":"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd","Type":"ContainerDied","Data":"010e36b761268009098dac8b6b3acec97266f39fcdde4db68cb4c18464d08624"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.325772 4971 scope.go:117] "RemoveContainer" containerID="d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326453 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2k26\" (UniqueName: \"kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26\") pod \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326543 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities\") pod 
\"9cb8b120-bccf-4c59-9c72-83c6169e3411\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content\") pod \"9cb8b120-bccf-4c59-9c72-83c6169e3411\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326615 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities\") pod \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content\") pod \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\" (UID: \"dbe25e82-76e3-4639-98f8-75a1e7f51c19\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.326691 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6tj7\" (UniqueName: \"kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7\") pod \"9cb8b120-bccf-4c59-9c72-83c6169e3411\" (UID: \"9cb8b120-bccf-4c59-9c72-83c6169e3411\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.327002 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9wjh\" (UniqueName: \"kubernetes.io/projected/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-kube-api-access-j9wjh\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.327019 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.327030 4971 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed6451f-4bc6-4dcc-b84c-413dbb95114b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.328582 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities" (OuterVolumeSpecName: "utilities") pod "dbe25e82-76e3-4639-98f8-75a1e7f51c19" (UID: "dbe25e82-76e3-4639-98f8-75a1e7f51c19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.328942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26" (OuterVolumeSpecName: "kube-api-access-z2k26") pod "dbe25e82-76e3-4639-98f8-75a1e7f51c19" (UID: "dbe25e82-76e3-4639-98f8-75a1e7f51c19"). InnerVolumeSpecName "kube-api-access-z2k26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.330065 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities" (OuterVolumeSpecName: "utilities") pod "9cb8b120-bccf-4c59-9c72-83c6169e3411" (UID: "9cb8b120-bccf-4c59-9c72-83c6169e3411"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.330257 4971 generic.go:334] "Generic (PLEG): container finished" podID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerID="a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.330324 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerDied","Data":"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.330369 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" event={"ID":"1ed6451f-4bc6-4dcc-b84c-413dbb95114b","Type":"ContainerDied","Data":"2af6caf033d893c4e6413834973ee705fc8147c41f3512d296b54c5573bd67f3"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.330433 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zqnt8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.332142 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7" (OuterVolumeSpecName: "kube-api-access-s6tj7") pod "9cb8b120-bccf-4c59-9c72-83c6169e3411" (UID: "9cb8b120-bccf-4c59-9c72-83c6169e3411"). InnerVolumeSpecName "kube-api-access-s6tj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.335473 4971 generic.go:334] "Generic (PLEG): container finished" podID="1054c243-8a85-4262-ba12-2ee5643d0255" containerID="abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.335533 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerDied","Data":"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.335557 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft9v2" event={"ID":"1054c243-8a85-4262-ba12-2ee5643d0255","Type":"ContainerDied","Data":"d9608f12a07ea913b8527583933f427cec7f268c21001b412b23477ab6fd1bff"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.335606 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft9v2" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.341293 4971 generic.go:334] "Generic (PLEG): container finished" podID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerID="aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.341341 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerDied","Data":"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.341475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dfvc" event={"ID":"9cb8b120-bccf-4c59-9c72-83c6169e3411","Type":"ContainerDied","Data":"742562f6700e65f8d7ab8ba92800039b08e1ad773349376590e6b20dbf5dc557"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.341535 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dfvc" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.345319 4971 generic.go:334] "Generic (PLEG): container finished" podID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerID="2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.345409 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerDied","Data":"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.345441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6x5k" event={"ID":"dbe25e82-76e3-4639-98f8-75a1e7f51c19","Type":"ContainerDied","Data":"f40d96482a75c7f4f80cb47f53babf9da0fb8431f7bfe6f7e27f28f91e255023"} Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.349535 4971 scope.go:117] "RemoveContainer" containerID="b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.351006 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6x5k" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.360797 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.364143 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zqnt8"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.379649 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cb8b120-bccf-4c59-9c72-83c6169e3411" (UID: "9cb8b120-bccf-4c59-9c72-83c6169e3411"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.383468 4971 scope.go:117] "RemoveContainer" containerID="03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.398579 4971 scope.go:117] "RemoveContainer" containerID="d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.401086 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5\": container with ID starting with d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5 not found: ID does not exist" containerID="d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.401127 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5"} err="failed to get container status 
\"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5\": rpc error: code = NotFound desc = could not find container \"d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5\": container with ID starting with d0dedb7d2f2fe08ca2cbef2591a2bf14007765ab92efea23107dc3b3ed5fe3b5 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.401157 4971 scope.go:117] "RemoveContainer" containerID="b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.401513 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17\": container with ID starting with b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17 not found: ID does not exist" containerID="b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.401544 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17"} err="failed to get container status \"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17\": rpc error: code = NotFound desc = could not find container \"b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17\": container with ID starting with b7a531e1b81878202962e62594813fb92ae7ec185e8d5b1a61dd5208e5026e17 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.401561 4971 scope.go:117] "RemoveContainer" containerID="03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.402058 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555\": container with ID starting with 03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555 not found: ID does not exist" containerID="03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.402087 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555"} err="failed to get container status \"03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555\": rpc error: code = NotFound desc = could not find container \"03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555\": container with ID starting with 03e353df500485a1110e6c0becf141e132839caf456fbe001811b6193153f555 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.402108 4971 scope.go:117] "RemoveContainer" containerID="a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.416608 4971 scope.go:117] "RemoveContainer" containerID="ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428453 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf854\" (UniqueName: \"kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854\") pod \"1054c243-8a85-4262-ba12-2ee5643d0255\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428497 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srmbw\" (UniqueName: \"kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw\") pod \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " Mar 09 09:25:53 crc 
kubenswrapper[4971]: I0309 09:25:53.428535 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content\") pod \"1054c243-8a85-4262-ba12-2ee5643d0255\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428589 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities\") pod \"1054c243-8a85-4262-ba12-2ee5643d0255\" (UID: \"1054c243-8a85-4262-ba12-2ee5643d0255\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428620 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities\") pod \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428648 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content\") pod \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\" (UID: \"e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd\") " Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428964 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428985 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cb8b120-bccf-4c59-9c72-83c6169e3411-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.428999 4971 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.429011 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6tj7\" (UniqueName: \"kubernetes.io/projected/9cb8b120-bccf-4c59-9c72-83c6169e3411-kube-api-access-s6tj7\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.429024 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2k26\" (UniqueName: \"kubernetes.io/projected/dbe25e82-76e3-4639-98f8-75a1e7f51c19-kube-api-access-z2k26\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.429438 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities" (OuterVolumeSpecName: "utilities") pod "1054c243-8a85-4262-ba12-2ee5643d0255" (UID: "1054c243-8a85-4262-ba12-2ee5643d0255"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.429898 4971 scope.go:117] "RemoveContainer" containerID="a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.430529 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities" (OuterVolumeSpecName: "utilities") pod "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" (UID: "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.431588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw" (OuterVolumeSpecName: "kube-api-access-srmbw") pod "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" (UID: "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd"). InnerVolumeSpecName "kube-api-access-srmbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.432638 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854" (OuterVolumeSpecName: "kube-api-access-hf854") pod "1054c243-8a85-4262-ba12-2ee5643d0255" (UID: "1054c243-8a85-4262-ba12-2ee5643d0255"). InnerVolumeSpecName "kube-api-access-hf854". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.432662 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2\": container with ID starting with a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2 not found: ID does not exist" containerID="a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.432689 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2"} err="failed to get container status \"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2\": rpc error: code = NotFound desc = could not find container \"a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2\": container with ID starting with 
a8bdae3a9aa58e61af07185b5e489fce445e10466f2b4ff1c92a91275f7934d2 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.432712 4971 scope.go:117] "RemoveContainer" containerID="ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.435147 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0\": container with ID starting with ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0 not found: ID does not exist" containerID="ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.435187 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0"} err="failed to get container status \"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0\": rpc error: code = NotFound desc = could not find container \"ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0\": container with ID starting with ce02bb1075c284aa444bfff808d0c5b398e493fbc55a84134cd986b105388be0 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.435220 4971 scope.go:117] "RemoveContainer" containerID="abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.451077 4971 scope.go:117] "RemoveContainer" containerID="e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.470147 4971 scope.go:117] "RemoveContainer" containerID="abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.488396 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1054c243-8a85-4262-ba12-2ee5643d0255" (UID: "1054c243-8a85-4262-ba12-2ee5643d0255"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.490731 4971 scope.go:117] "RemoveContainer" containerID="abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.491322 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa\": container with ID starting with abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa not found: ID does not exist" containerID="abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.491383 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa"} err="failed to get container status \"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa\": rpc error: code = NotFound desc = could not find container \"abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa\": container with ID starting with abaa47a6c4c1d5ffbb9e38d7598036e3eb6edaf32ce6d33ac34fcdeec5e1fcaa not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.491412 4971 scope.go:117] "RemoveContainer" containerID="e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.492160 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83\": 
container with ID starting with e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83 not found: ID does not exist" containerID="e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.492188 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83"} err="failed to get container status \"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83\": rpc error: code = NotFound desc = could not find container \"e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83\": container with ID starting with e8a19e868089f0f66d6f010190443aadd0d112b4c95f6118e51f84124b498f83 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.492208 4971 scope.go:117] "RemoveContainer" containerID="abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.492494 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f\": container with ID starting with abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f not found: ID does not exist" containerID="abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.492515 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f"} err="failed to get container status \"abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f\": rpc error: code = NotFound desc = could not find container \"abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f\": container with ID starting with 
abfa4c21dc6ec470e2bbbcf8e4aafc576c7aa170f7f11636f67174966cb1a62f not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.492532 4971 scope.go:117] "RemoveContainer" containerID="aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.500316 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbe25e82-76e3-4639-98f8-75a1e7f51c19" (UID: "dbe25e82-76e3-4639-98f8-75a1e7f51c19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.501067 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" (UID: "e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.504561 4971 scope.go:117] "RemoveContainer" containerID="ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.518913 4971 scope.go:117] "RemoveContainer" containerID="51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531829 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbe25e82-76e3-4639-98f8-75a1e7f51c19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531889 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531904 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531915 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531958 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf854\" (UniqueName: \"kubernetes.io/projected/1054c243-8a85-4262-ba12-2ee5643d0255-kube-api-access-hf854\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531972 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srmbw\" (UniqueName: 
\"kubernetes.io/projected/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd-kube-api-access-srmbw\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.531983 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1054c243-8a85-4262-ba12-2ee5643d0255-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.535013 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-26997"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.536048 4971 scope.go:117] "RemoveContainer" containerID="aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.536462 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020\": container with ID starting with aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020 not found: ID does not exist" containerID="aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.536497 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020"} err="failed to get container status \"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020\": rpc error: code = NotFound desc = could not find container \"aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020\": container with ID starting with aa6447d542dcde765db807feb3b3cb233af3d71acee837533cf1280dbf811020 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.536524 4971 scope.go:117] "RemoveContainer" containerID="ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8" Mar 09 
09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.536861 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8\": container with ID starting with ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8 not found: ID does not exist" containerID="ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.536932 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8"} err="failed to get container status \"ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8\": rpc error: code = NotFound desc = could not find container \"ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8\": container with ID starting with ea1420ceee293501f737d2da43ca644d5c18cc4d20a9f9e75467463ce26da0d8 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.536950 4971 scope.go:117] "RemoveContainer" containerID="51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.537384 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759\": container with ID starting with 51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759 not found: ID does not exist" containerID="51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.537429 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759"} err="failed to get container status 
\"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759\": rpc error: code = NotFound desc = could not find container \"51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759\": container with ID starting with 51663fd9e42e2c45a32c57f4ae3dde98d1a770eb73da12d3f796683e7dba1759 not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.537480 4971 scope.go:117] "RemoveContainer" containerID="2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.554622 4971 scope.go:117] "RemoveContainer" containerID="8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.560424 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-66mp8"] Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561640 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561686 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561697 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561703 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561709 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561717 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561727 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561733 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561768 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561774 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561782 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561788 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561796 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561802 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561811 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561841 4971 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561848 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561854 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561862 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561868 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="extract-utilities" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561878 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561884 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="extract-content" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561889 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561895 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561928 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561933 4971 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.561940 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.561946 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562091 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562101 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562110 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562118 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562127 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" containerName="marketplace-operator" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562164 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" containerName="registry-server" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.562587 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.564419 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.566820 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.566996 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.567461 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.567552 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.567794 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.569590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-66mp8"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.573396 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.585744 4971 scope.go:117] "RemoveContainer" containerID="a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.604930 4971 scope.go:117] "RemoveContainer" containerID="2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 
09:25:53.605332 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd\": container with ID starting with 2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd not found: ID does not exist" containerID="2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.605387 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd"} err="failed to get container status \"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd\": rpc error: code = NotFound desc = could not find container \"2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd\": container with ID starting with 2e0b71b37e6c446367e99b79a27d62f1311e67c1646f01f7c2d6ee4a8fcd7acd not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.605437 4971 scope.go:117] "RemoveContainer" containerID="8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.605707 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d\": container with ID starting with 8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d not found: ID does not exist" containerID="8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.605731 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d"} err="failed to get container status \"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d\": rpc 
error: code = NotFound desc = could not find container \"8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d\": container with ID starting with 8cc8fcea12f3bf050a8bfd06f356ba185ad008fc4401b9a286faeb1b4d65e39d not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.605746 4971 scope.go:117] "RemoveContainer" containerID="a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f" Mar 09 09:25:53 crc kubenswrapper[4971]: E0309 09:25:53.606049 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f\": container with ID starting with a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f not found: ID does not exist" containerID="a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.606089 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f"} err="failed to get container status \"a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f\": rpc error: code = NotFound desc = could not find container \"a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f\": container with ID starting with a00ac93510f82d2dcb361af8d5e5a517a32e9e2abc45a8ac63848e6c1c2ff32f not found: ID does not exist" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.659512 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cqt9"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.664776 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5cqt9"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.668683 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-ft9v2"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.678326 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ft9v2"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.690951 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.695607 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6x5k"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.705112 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.717980 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dfvc"] Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.734429 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzd2\" (UniqueName: \"kubernetes.io/projected/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-kube-api-access-lwzd2\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.734492 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.734528 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-serving-cert\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.734571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-config\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.734608 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-client-ca\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.835451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-serving-cert\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.835520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-config\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.835563 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-client-ca\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.835609 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzd2\" (UniqueName: \"kubernetes.io/projected/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-kube-api-access-lwzd2\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.835645 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.836721 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-proxy-ca-bundles\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.836759 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-client-ca\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " 
pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.837120 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-config\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.840542 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-serving-cert\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.853326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzd2\" (UniqueName: \"kubernetes.io/projected/8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c-kube-api-access-lwzd2\") pod \"controller-manager-f8d9f876c-66mp8\" (UID: \"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c\") " pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:53 crc kubenswrapper[4971]: I0309 09:25:53.886976 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.062224 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8d9f876c-66mp8"] Mar 09 09:25:54 crc kubenswrapper[4971]: W0309 09:25:54.072496 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4a8b97_7b89_47e6_afd0_88c09e2fdc5c.slice/crio-087fbaa3971d0afc48b199e7d1163dfea4d929a50b31db5500c0d2d5c30917de WatchSource:0}: Error finding container 087fbaa3971d0afc48b199e7d1163dfea4d929a50b31db5500c0d2d5c30917de: Status 404 returned error can't find the container with id 087fbaa3971d0afc48b199e7d1163dfea4d929a50b31db5500c0d2d5c30917de Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.362736 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26997" event={"ID":"3cda571e-d5b5-4436-8846-df239e1c4b79","Type":"ContainerStarted","Data":"37ea26073d902fb0c43565327a520dc429871fd950051d916c62b61a1480984d"} Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.363123 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-26997" event={"ID":"3cda571e-d5b5-4436-8846-df239e1c4b79","Type":"ContainerStarted","Data":"b6bcc92df43e2be0a6ad45d905de05550724ba879ce4dda2cbbedd2f50880709"} Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.363144 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-26997" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.367135 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" 
event={"ID":"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c","Type":"ContainerStarted","Data":"468d22ea798abb7e3d12b6f54705e7e70bbcee5dbe2175bbe1c1e95b4016c1b3"} Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.367176 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" event={"ID":"8e4a8b97-7b89-47e6-afd0-88c09e2fdc5c","Type":"ContainerStarted","Data":"087fbaa3971d0afc48b199e7d1163dfea4d929a50b31db5500c0d2d5c30917de"} Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.367210 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-26997" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.399979 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-26997" podStartSLOduration=2.3999586600000002 podStartE2EDuration="2.39995866s" podCreationTimestamp="2026-03-09 09:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:54.396442684 +0000 UTC m=+357.956370514" watchObservedRunningTime="2026-03-09 09:25:54.39995866 +0000 UTC m=+357.959886470" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.414621 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" podStartSLOduration=3.414605529 podStartE2EDuration="3.414605529s" podCreationTimestamp="2026-03-09 09:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:54.413242316 +0000 UTC m=+357.973170146" watchObservedRunningTime="2026-03-09 09:25:54.414605529 +0000 UTC m=+357.974533339" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.885466 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-przm2"] Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.887044 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.889902 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:25:54 crc kubenswrapper[4971]: I0309 09:25:54.895961 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-przm2"] Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.048619 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-catalog-content\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.048661 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6qc\" (UniqueName: \"kubernetes.io/projected/a31dcdaa-d065-40ad-b444-896e8f2524bc-kube-api-access-5z6qc\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.048713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-utilities\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.084562 4971 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.086799 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.088677 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.097396 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.150225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6qc\" (UniqueName: \"kubernetes.io/projected/a31dcdaa-d065-40ad-b444-896e8f2524bc-kube-api-access-5z6qc\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.150274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-catalog-content\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.150306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-utilities\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.150758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-utilities\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.150889 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31dcdaa-d065-40ad-b444-896e8f2524bc-catalog-content\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.157079 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1054c243-8a85-4262-ba12-2ee5643d0255" path="/var/lib/kubelet/pods/1054c243-8a85-4262-ba12-2ee5643d0255/volumes" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.157791 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed6451f-4bc6-4dcc-b84c-413dbb95114b" path="/var/lib/kubelet/pods/1ed6451f-4bc6-4dcc-b84c-413dbb95114b/volumes" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.158240 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb8b120-bccf-4c59-9c72-83c6169e3411" path="/var/lib/kubelet/pods/9cb8b120-bccf-4c59-9c72-83c6169e3411/volumes" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.159267 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbe25e82-76e3-4639-98f8-75a1e7f51c19" path="/var/lib/kubelet/pods/dbe25e82-76e3-4639-98f8-75a1e7f51c19/volumes" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.159849 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd" path="/var/lib/kubelet/pods/e807cb52-ad3e-4bb8-85d1-b6e3ee6870dd/volumes" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.169678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5z6qc\" (UniqueName: \"kubernetes.io/projected/a31dcdaa-d065-40ad-b444-896e8f2524bc-kube-api-access-5z6qc\") pod \"redhat-marketplace-przm2\" (UID: \"a31dcdaa-d065-40ad-b444-896e8f2524bc\") " pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.205921 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-przm2" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.251590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.251712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.251765 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx45n\" (UniqueName: \"kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.353561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content\") pod \"community-operators-9v8xz\" 
(UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.353656 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.353711 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx45n\" (UniqueName: \"kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.354423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.354526 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities\") pod \"community-operators-9v8xz\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.375788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx45n\" (UniqueName: \"kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n\") pod \"community-operators-9v8xz\" (UID: 
\"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.380893 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.400934 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f8d9f876c-66mp8" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.406026 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.450193 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-przm2"] Mar 09 09:25:55 crc kubenswrapper[4971]: I0309 09:25:55.622290 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:25:55 crc kubenswrapper[4971]: W0309 09:25:55.624045 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ebe249_212b_44fd_87f9_3c8db2c3b826.slice/crio-23aeccd20330515235e1c074078bb22a05148521812684504fd473de10d5131a WatchSource:0}: Error finding container 23aeccd20330515235e1c074078bb22a05148521812684504fd473de10d5131a: Status 404 returned error can't find the container with id 23aeccd20330515235e1c074078bb22a05148521812684504fd473de10d5131a Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.391050 4971 generic.go:334] "Generic (PLEG): container finished" podID="a31dcdaa-d065-40ad-b444-896e8f2524bc" containerID="82280c93e49b3604314ab7c073094cf3c7225cf37e56eca7fc997e57bd42e1c8" exitCode=0 Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.391156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-przm2" event={"ID":"a31dcdaa-d065-40ad-b444-896e8f2524bc","Type":"ContainerDied","Data":"82280c93e49b3604314ab7c073094cf3c7225cf37e56eca7fc997e57bd42e1c8"} Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.391450 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-przm2" event={"ID":"a31dcdaa-d065-40ad-b444-896e8f2524bc","Type":"ContainerStarted","Data":"9900968b0180cb9cbdc05474949564257fe29aad69b9975998359e50f897e0a0"} Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.393330 4971 generic.go:334] "Generic (PLEG): container finished" podID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerID="4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5" exitCode=0 Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.393416 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerDied","Data":"4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5"} Mar 09 09:25:56 crc kubenswrapper[4971]: I0309 09:25:56.393466 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerStarted","Data":"23aeccd20330515235e1c074078bb22a05148521812684504fd473de10d5131a"} Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.285096 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pqnjj"] Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.286275 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.291269 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.299036 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pqnjj"] Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.383487 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-utilities\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.383599 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmt28\" (UniqueName: \"kubernetes.io/projected/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-kube-api-access-kmt28\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.383650 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-catalog-content\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.401638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-przm2" 
event={"ID":"a31dcdaa-d065-40ad-b444-896e8f2524bc","Type":"ContainerStarted","Data":"b6bc75887fad3f18f047a8c9df760e07d4bde397b23808e425eb65994f35b0d1"} Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.405448 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerStarted","Data":"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69"} Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.485127 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qdtbr"] Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.486438 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.486509 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-catalog-content\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.486591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-utilities\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.486644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmt28\" (UniqueName: \"kubernetes.io/projected/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-kube-api-access-kmt28\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " 
pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.487180 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-catalog-content\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.487652 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-utilities\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.489771 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.497718 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdtbr"] Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.513088 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmt28\" (UniqueName: \"kubernetes.io/projected/cef1bcb9-ac3d-4891-8308-d53d5acf90ac-kube-api-access-kmt28\") pod \"certified-operators-pqnjj\" (UID: \"cef1bcb9-ac3d-4891-8308-d53d5acf90ac\") " pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.587681 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-catalog-content\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 
09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.587911 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-utilities\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.588110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hczm\" (UniqueName: \"kubernetes.io/projected/136304aa-bacf-46c4-8994-bc6491555b4c-kube-api-access-2hczm\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.638660 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pqnjj" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.689424 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hczm\" (UniqueName: \"kubernetes.io/projected/136304aa-bacf-46c4-8994-bc6491555b4c-kube-api-access-2hczm\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.689489 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-catalog-content\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.689528 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-utilities\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.690092 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-catalog-content\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.690143 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/136304aa-bacf-46c4-8994-bc6491555b4c-utilities\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.707001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hczm\" (UniqueName: \"kubernetes.io/projected/136304aa-bacf-46c4-8994-bc6491555b4c-kube-api-access-2hczm\") pod \"redhat-operators-qdtbr\" (UID: \"136304aa-bacf-46c4-8994-bc6491555b4c\") " pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:57 crc kubenswrapper[4971]: I0309 09:25:57.804320 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdtbr" Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.035641 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pqnjj"] Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.192959 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdtbr"] Mar 09 09:25:58 crc kubenswrapper[4971]: W0309 09:25:58.250001 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod136304aa_bacf_46c4_8994_bc6491555b4c.slice/crio-e494d40b40bc11c77d67c8360913d487ac4ee76b2645f7b585d24b65c2a7de77 WatchSource:0}: Error finding container e494d40b40bc11c77d67c8360913d487ac4ee76b2645f7b585d24b65c2a7de77: Status 404 returned error can't find the container with id e494d40b40bc11c77d67c8360913d487ac4ee76b2645f7b585d24b65c2a7de77 Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.411240 4971 generic.go:334] "Generic (PLEG): container finished" podID="cef1bcb9-ac3d-4891-8308-d53d5acf90ac" containerID="40e5a961876801830b05ec79e323699ae905849cc2d75445bc2b244ce99fe605" exitCode=0 Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.412071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqnjj" event={"ID":"cef1bcb9-ac3d-4891-8308-d53d5acf90ac","Type":"ContainerDied","Data":"40e5a961876801830b05ec79e323699ae905849cc2d75445bc2b244ce99fe605"} Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.412099 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqnjj" event={"ID":"cef1bcb9-ac3d-4891-8308-d53d5acf90ac","Type":"ContainerStarted","Data":"ff46bc6d84bd431111073f4227d1da7057aaf2ee0ebd6eb24cfe9b01d0afd88f"} Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.415930 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="136304aa-bacf-46c4-8994-bc6491555b4c" containerID="17badbf9321fa92c5dc5b74de5e6fe44e1a673f5f207f1298e0bb0a6c1cdfa39" exitCode=0 Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.416015 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdtbr" event={"ID":"136304aa-bacf-46c4-8994-bc6491555b4c","Type":"ContainerDied","Data":"17badbf9321fa92c5dc5b74de5e6fe44e1a673f5f207f1298e0bb0a6c1cdfa39"} Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.416045 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdtbr" event={"ID":"136304aa-bacf-46c4-8994-bc6491555b4c","Type":"ContainerStarted","Data":"e494d40b40bc11c77d67c8360913d487ac4ee76b2645f7b585d24b65c2a7de77"} Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.420435 4971 generic.go:334] "Generic (PLEG): container finished" podID="a31dcdaa-d065-40ad-b444-896e8f2524bc" containerID="b6bc75887fad3f18f047a8c9df760e07d4bde397b23808e425eb65994f35b0d1" exitCode=0 Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.420497 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-przm2" event={"ID":"a31dcdaa-d065-40ad-b444-896e8f2524bc","Type":"ContainerDied","Data":"b6bc75887fad3f18f047a8c9df760e07d4bde397b23808e425eb65994f35b0d1"} Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.424243 4971 generic.go:334] "Generic (PLEG): container finished" podID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerID="9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69" exitCode=0 Mar 09 09:25:58 crc kubenswrapper[4971]: I0309 09:25:58.424284 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerDied","Data":"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69"} Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.434665 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerStarted","Data":"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7"} Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.441364 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqnjj" event={"ID":"cef1bcb9-ac3d-4891-8308-d53d5acf90ac","Type":"ContainerStarted","Data":"17702b8c2132bc891d44e7d80962c588dc0c55a756affb5b3dfc22978fc835fd"} Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.443016 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdtbr" event={"ID":"136304aa-bacf-46c4-8994-bc6491555b4c","Type":"ContainerStarted","Data":"4fe6306bc3f55d53751b9f47cfd0ba2544e2262e316a9ce074047165e8af53d9"} Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.445248 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-przm2" event={"ID":"a31dcdaa-d065-40ad-b444-896e8f2524bc","Type":"ContainerStarted","Data":"bb90495c3a63a03e1e40869de82e1a82ad0b77b1e8170f053c476514c26bf51f"} Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.457078 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9v8xz" podStartSLOduration=1.986602246 podStartE2EDuration="4.457059505s" podCreationTimestamp="2026-03-09 09:25:55 +0000 UTC" firstStartedPulling="2026-03-09 09:25:56.395276379 +0000 UTC m=+359.955204189" lastFinishedPulling="2026-03-09 09:25:58.865733638 +0000 UTC m=+362.425661448" observedRunningTime="2026-03-09 09:25:59.454154893 +0000 UTC m=+363.014082713" watchObservedRunningTime="2026-03-09 09:25:59.457059505 +0000 UTC m=+363.016987315" Mar 09 09:25:59 crc kubenswrapper[4971]: I0309 09:25:59.475831 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-przm2" podStartSLOduration=2.833747238 podStartE2EDuration="5.475683391s" podCreationTimestamp="2026-03-09 09:25:54 +0000 UTC" firstStartedPulling="2026-03-09 09:25:56.395946845 +0000 UTC m=+359.955874655" lastFinishedPulling="2026-03-09 09:25:59.037882998 +0000 UTC m=+362.597810808" observedRunningTime="2026-03-09 09:25:59.469801277 +0000 UTC m=+363.029729097" watchObservedRunningTime="2026-03-09 09:25:59.475683391 +0000 UTC m=+363.035611201"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.135521 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550806-5x6nv"]
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.136156 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.138273 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.139949 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.140406 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.144379 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-5x6nv"]
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.323806 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjkv\" (UniqueName: \"kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv\") pod \"auto-csr-approver-29550806-5x6nv\" (UID: \"a31b7627-abfd-4227-b142-0fcdca9e2b0b\") " pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.425245 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjkv\" (UniqueName: \"kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv\") pod \"auto-csr-approver-29550806-5x6nv\" (UID: \"a31b7627-abfd-4227-b142-0fcdca9e2b0b\") " pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.461387 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjkv\" (UniqueName: \"kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv\") pod \"auto-csr-approver-29550806-5x6nv\" (UID: \"a31b7627-abfd-4227-b142-0fcdca9e2b0b\") " pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.472885 4971 generic.go:334] "Generic (PLEG): container finished" podID="cef1bcb9-ac3d-4891-8308-d53d5acf90ac" containerID="17702b8c2132bc891d44e7d80962c588dc0c55a756affb5b3dfc22978fc835fd" exitCode=0
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.472990 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqnjj" event={"ID":"cef1bcb9-ac3d-4891-8308-d53d5acf90ac","Type":"ContainerDied","Data":"17702b8c2132bc891d44e7d80962c588dc0c55a756affb5b3dfc22978fc835fd"}
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.476012 4971 generic.go:334] "Generic (PLEG): container finished" podID="136304aa-bacf-46c4-8994-bc6491555b4c" containerID="4fe6306bc3f55d53751b9f47cfd0ba2544e2262e316a9ce074047165e8af53d9" exitCode=0
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.476071 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdtbr" event={"ID":"136304aa-bacf-46c4-8994-bc6491555b4c","Type":"ContainerDied","Data":"4fe6306bc3f55d53751b9f47cfd0ba2544e2262e316a9ce074047165e8af53d9"}
Mar 09 09:26:00 crc kubenswrapper[4971]: I0309 09:26:00.759620 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.236591 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-5x6nv"]
Mar 09 09:26:01 crc kubenswrapper[4971]: W0309 09:26:01.241037 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda31b7627_abfd_4227_b142_0fcdca9e2b0b.slice/crio-daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18 WatchSource:0}: Error finding container daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18: Status 404 returned error can't find the container with id daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.483656 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdtbr" event={"ID":"136304aa-bacf-46c4-8994-bc6491555b4c","Type":"ContainerStarted","Data":"ad08581f967d380ca801adb8890f5d12dca4d70f5eb5c663941596b45668b862"}
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.485364 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-5x6nv" event={"ID":"a31b7627-abfd-4227-b142-0fcdca9e2b0b","Type":"ContainerStarted","Data":"daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18"}
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.487889 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pqnjj" event={"ID":"cef1bcb9-ac3d-4891-8308-d53d5acf90ac","Type":"ContainerStarted","Data":"963df49a96f932bf73502a021f882e5a150236a09ef6d7c04ea3dfb898e37877"}
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.500468 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qdtbr" podStartSLOduration=2.017064448 podStartE2EDuration="4.500449963s" podCreationTimestamp="2026-03-09 09:25:57 +0000 UTC" firstStartedPulling="2026-03-09 09:25:58.417432787 +0000 UTC m=+361.977360597" lastFinishedPulling="2026-03-09 09:26:00.900818302 +0000 UTC m=+364.460746112" observedRunningTime="2026-03-09 09:26:01.499892339 +0000 UTC m=+365.059820149" watchObservedRunningTime="2026-03-09 09:26:01.500449963 +0000 UTC m=+365.060377773"
Mar 09 09:26:01 crc kubenswrapper[4971]: I0309 09:26:01.519870 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pqnjj" podStartSLOduration=1.859866334 podStartE2EDuration="4.519850369s" podCreationTimestamp="2026-03-09 09:25:57 +0000 UTC" firstStartedPulling="2026-03-09 09:25:58.412797133 +0000 UTC m=+361.972724933" lastFinishedPulling="2026-03-09 09:26:01.072781158 +0000 UTC m=+364.632708968" observedRunningTime="2026-03-09 09:26:01.51744736 +0000 UTC m=+365.077375170" watchObservedRunningTime="2026-03-09 09:26:01.519850369 +0000 UTC m=+365.079778179"
Mar 09 09:26:02 crc kubenswrapper[4971]: I0309 09:26:02.495454 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-5x6nv" event={"ID":"a31b7627-abfd-4227-b142-0fcdca9e2b0b","Type":"ContainerStarted","Data":"dd5eef1804aa68fd008d1b1c595dfd66ff453887a895d5ba26105d768f3b6e03"}
Mar 09 09:26:02 crc kubenswrapper[4971]: I0309 09:26:02.513430 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550806-5x6nv" podStartSLOduration=1.6491204480000001 podStartE2EDuration="2.513415378s" podCreationTimestamp="2026-03-09 09:26:00 +0000 UTC" firstStartedPulling="2026-03-09 09:26:01.243165265 +0000 UTC m=+364.803093075" lastFinishedPulling="2026-03-09 09:26:02.107460205 +0000 UTC m=+365.667388005" observedRunningTime="2026-03-09 09:26:02.512906676 +0000 UTC m=+366.072834496" watchObservedRunningTime="2026-03-09 09:26:02.513415378 +0000 UTC m=+366.073343198"
Mar 09 09:26:03 crc kubenswrapper[4971]: I0309 09:26:03.501202 4971 generic.go:334] "Generic (PLEG): container finished" podID="a31b7627-abfd-4227-b142-0fcdca9e2b0b" containerID="dd5eef1804aa68fd008d1b1c595dfd66ff453887a895d5ba26105d768f3b6e03" exitCode=0
Mar 09 09:26:03 crc kubenswrapper[4971]: I0309 09:26:03.501256 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-5x6nv" event={"ID":"a31b7627-abfd-4227-b142-0fcdca9e2b0b","Type":"ContainerDied","Data":"dd5eef1804aa68fd008d1b1c595dfd66ff453887a895d5ba26105d768f3b6e03"}
Mar 09 09:26:04 crc kubenswrapper[4971]: I0309 09:26:04.891567 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.093860 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjkv\" (UniqueName: \"kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv\") pod \"a31b7627-abfd-4227-b142-0fcdca9e2b0b\" (UID: \"a31b7627-abfd-4227-b142-0fcdca9e2b0b\") "
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.102662 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv" (OuterVolumeSpecName: "kube-api-access-nfjkv") pod "a31b7627-abfd-4227-b142-0fcdca9e2b0b" (UID: "a31b7627-abfd-4227-b142-0fcdca9e2b0b"). InnerVolumeSpecName "kube-api-access-nfjkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.196274 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjkv\" (UniqueName: \"kubernetes.io/projected/a31b7627-abfd-4227-b142-0fcdca9e2b0b-kube-api-access-nfjkv\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.206484 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-przm2"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.206528 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-przm2"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.261109 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-przm2"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.406932 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9v8xz"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.406981 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9v8xz"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.451491 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9v8xz"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.515325 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-5x6nv" event={"ID":"a31b7627-abfd-4227-b142-0fcdca9e2b0b","Type":"ContainerDied","Data":"daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18"}
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.515416 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daf4bf1d36bbaf13b1bf069d9db630988d4faa1868a66838219bb6fd5db72d18"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.515585 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-5x6nv"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.556033 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-przm2"
Mar 09 09:26:05 crc kubenswrapper[4971]: I0309 09:26:05.556160 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9v8xz"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.639392 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pqnjj"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.640195 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pqnjj"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.682048 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pqnjj"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.804785 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qdtbr"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.804844 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qdtbr"
Mar 09 09:26:07 crc kubenswrapper[4971]: I0309 09:26:07.846941 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qdtbr"
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.302952 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" podUID="7ddfae4b-5893-4e15-a983-1adb19c5970e" containerName="registry" containerID="cri-o://29e0f6f01e4e568b808e86167b263dfc4bac34d6888c852ac22672675caf489e" gracePeriod=30
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.538785 4971 generic.go:334] "Generic (PLEG): container finished" podID="7ddfae4b-5893-4e15-a983-1adb19c5970e" containerID="29e0f6f01e4e568b808e86167b263dfc4bac34d6888c852ac22672675caf489e" exitCode=0
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.539458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" event={"ID":"7ddfae4b-5893-4e15-a983-1adb19c5970e","Type":"ContainerDied","Data":"29e0f6f01e4e568b808e86167b263dfc4bac34d6888c852ac22672675caf489e"}
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.583043 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pqnjj"
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.587373 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qdtbr"
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.728338 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843610 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843635 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843651 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843807 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xlzb\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843888 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.843905 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls\") pod \"7ddfae4b-5893-4e15-a983-1adb19c5970e\" (UID: \"7ddfae4b-5893-4e15-a983-1adb19c5970e\") "
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.845092 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.845406 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.851385 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.851671 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb" (OuterVolumeSpecName: "kube-api-access-8xlzb") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "kube-api-access-8xlzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.857466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.857642 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.867072 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.881919 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7ddfae4b-5893-4e15-a983-1adb19c5970e" (UID: "7ddfae4b-5893-4e15-a983-1adb19c5970e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.944996 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xlzb\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-kube-api-access-8xlzb\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945244 4971 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945256 4971 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945266 4971 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ddfae4b-5893-4e15-a983-1adb19c5970e-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945276 4971 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ddfae4b-5893-4e15-a983-1adb19c5970e-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945286 4971 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ddfae4b-5893-4e15-a983-1adb19c5970e-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:08 crc kubenswrapper[4971]: I0309 09:26:08.945294 4971 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ddfae4b-5893-4e15-a983-1adb19c5970e-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:09 crc kubenswrapper[4971]: I0309 09:26:09.547496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv" event={"ID":"7ddfae4b-5893-4e15-a983-1adb19c5970e","Type":"ContainerDied","Data":"2a9d85b0ab65b31748dff0688cc9a2b07211e62ad6aacbb46b667420aa51f3cc"}
Mar 09 09:26:09 crc kubenswrapper[4971]: I0309 09:26:09.547573 4971 scope.go:117] "RemoveContainer" containerID="29e0f6f01e4e568b808e86167b263dfc4bac34d6888c852ac22672675caf489e"
Mar 09 09:26:09 crc kubenswrapper[4971]: I0309 09:26:09.547629 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-d6xhv"
Mar 09 09:26:09 crc kubenswrapper[4971]: I0309 09:26:09.564278 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"]
Mar 09 09:26:09 crc kubenswrapper[4971]: I0309 09:26:09.567771 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-d6xhv"]
Mar 09 09:26:11 crc kubenswrapper[4971]: I0309 09:26:11.158971 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ddfae4b-5893-4e15-a983-1adb19c5970e" path="/var/lib/kubelet/pods/7ddfae4b-5893-4e15-a983-1adb19c5970e/volumes"
Mar 09 09:27:44 crc kubenswrapper[4971]: I0309 09:27:44.794559 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:27:44 crc kubenswrapper[4971]: I0309 09:27:44.795659 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.133113 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550808-w22d4"]
Mar 09 09:28:00 crc kubenswrapper[4971]: E0309 09:28:00.133917 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31b7627-abfd-4227-b142-0fcdca9e2b0b" containerName="oc"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.133932 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31b7627-abfd-4227-b142-0fcdca9e2b0b" containerName="oc"
Mar 09 09:28:00 crc kubenswrapper[4971]: E0309 09:28:00.133950 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddfae4b-5893-4e15-a983-1adb19c5970e" containerName="registry"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.133959 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddfae4b-5893-4e15-a983-1adb19c5970e" containerName="registry"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.134076 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a31b7627-abfd-4227-b142-0fcdca9e2b0b" containerName="oc"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.134092 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddfae4b-5893-4e15-a983-1adb19c5970e" containerName="registry"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.134540 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.137712 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.137770 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.137849 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.142643 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-w22d4"]
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.236019 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxz7n\" (UniqueName: \"kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n\") pod \"auto-csr-approver-29550808-w22d4\" (UID: \"ee51aea1-202c-473d-ac89-4db3058e25a1\") " pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.337860 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxz7n\" (UniqueName: \"kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n\") pod \"auto-csr-approver-29550808-w22d4\" (UID: \"ee51aea1-202c-473d-ac89-4db3058e25a1\") " pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.359435 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxz7n\" (UniqueName: \"kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n\") pod \"auto-csr-approver-29550808-w22d4\" (UID: \"ee51aea1-202c-473d-ac89-4db3058e25a1\") " pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.483059 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.646007 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-w22d4"]
Mar 09 09:28:00 crc kubenswrapper[4971]: W0309 09:28:00.652037 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee51aea1_202c_473d_ac89_4db3058e25a1.slice/crio-15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2 WatchSource:0}: Error finding container 15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2: Status 404 returned error can't find the container with id 15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2
Mar 09 09:28:00 crc kubenswrapper[4971]: I0309 09:28:00.654139 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:28:01 crc kubenswrapper[4971]: I0309 09:28:01.207105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-w22d4" event={"ID":"ee51aea1-202c-473d-ac89-4db3058e25a1","Type":"ContainerStarted","Data":"15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2"}
Mar 09 09:28:02 crc kubenswrapper[4971]: I0309 09:28:02.214698 4971 generic.go:334] "Generic (PLEG): container finished" podID="ee51aea1-202c-473d-ac89-4db3058e25a1" containerID="fd3a314e7c2e35d412daa54c7b0d57ffeb9845b8a321765df386308d1d9eae61" exitCode=0
Mar 09 09:28:02 crc kubenswrapper[4971]: I0309 09:28:02.214744 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-w22d4" event={"ID":"ee51aea1-202c-473d-ac89-4db3058e25a1","Type":"ContainerDied","Data":"fd3a314e7c2e35d412daa54c7b0d57ffeb9845b8a321765df386308d1d9eae61"}
Mar 09 09:28:03 crc kubenswrapper[4971]: I0309 09:28:03.416384 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:03 crc kubenswrapper[4971]: I0309 09:28:03.579099 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxz7n\" (UniqueName: \"kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n\") pod \"ee51aea1-202c-473d-ac89-4db3058e25a1\" (UID: \"ee51aea1-202c-473d-ac89-4db3058e25a1\") "
Mar 09 09:28:03 crc kubenswrapper[4971]: I0309 09:28:03.584962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n" (OuterVolumeSpecName: "kube-api-access-rxz7n") pod "ee51aea1-202c-473d-ac89-4db3058e25a1" (UID: "ee51aea1-202c-473d-ac89-4db3058e25a1"). InnerVolumeSpecName "kube-api-access-rxz7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:03 crc kubenswrapper[4971]: I0309 09:28:03.680790 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxz7n\" (UniqueName: \"kubernetes.io/projected/ee51aea1-202c-473d-ac89-4db3058e25a1-kube-api-access-rxz7n\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:04 crc kubenswrapper[4971]: I0309 09:28:04.226508 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-w22d4" event={"ID":"ee51aea1-202c-473d-ac89-4db3058e25a1","Type":"ContainerDied","Data":"15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2"}
Mar 09 09:28:04 crc kubenswrapper[4971]: I0309 09:28:04.226792 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f1d2769ca27136817966b9c995854f39469e670e4eb42ea453d8b973c025e2"
Mar 09 09:28:04 crc kubenswrapper[4971]: I0309 09:28:04.226561 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-w22d4"
Mar 09 09:28:04 crc kubenswrapper[4971]: I0309 09:28:04.464208 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-d8cbz"]
Mar 09 09:28:04 crc kubenswrapper[4971]: I0309 09:28:04.467126 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-d8cbz"]
Mar 09 09:28:05 crc kubenswrapper[4971]: I0309 09:28:05.158721 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603b9f27-06c0-4fe8-8cc3-416122462369" path="/var/lib/kubelet/pods/603b9f27-06c0-4fe8-8cc3-416122462369/volumes"
Mar 09 09:28:14 crc kubenswrapper[4971]: I0309 09:28:14.794507 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:28:14 crc kubenswrapper[4971]: I0309 09:28:14.795070 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:28:44 crc kubenswrapper[4971]: I0309 09:28:44.795342 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:28:44 crc kubenswrapper[4971]: I0309 09:28:44.795941 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:28:44 crc kubenswrapper[4971]: I0309 09:28:44.795987 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:28:44 crc kubenswrapper[4971]: I0309 09:28:44.796554 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:28:44 crc kubenswrapper[4971]: I0309 09:28:44.796607 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570" gracePeriod=600
Mar 09 09:28:45 crc kubenswrapper[4971]: I0309 09:28:45.566606 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570" exitCode=0
Mar 09 09:28:45 crc kubenswrapper[4971]: I0309 09:28:45.566678 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570"}
Mar 09 09:28:45 crc kubenswrapper[4971]: I0309 09:28:45.567194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c"}
Mar 09 09:28:45 crc kubenswrapper[4971]: I0309 09:28:45.567238 4971 scope.go:117] "RemoveContainer" containerID="ae9ddb9ff311e15e0bec8cf007b9275af5870d3030b314990b85d278c01e4a3e"
Mar 09 09:28:57 crc kubenswrapper[4971]: I0309 09:28:57.654976 4971 scope.go:117] "RemoveContainer" containerID="5f39f2443e019d56e40c2f46b6f04fe595e46e9d8c4aed977f878cae5f6bd534"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.133574 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550810-ls2c9"]
Mar 09 09:30:00 crc kubenswrapper[4971]: E0309 09:30:00.134735 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee51aea1-202c-473d-ac89-4db3058e25a1" containerName="oc"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.134759 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee51aea1-202c-473d-ac89-4db3058e25a1" containerName="oc"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.134855 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee51aea1-202c-473d-ac89-4db3058e25a1" containerName="oc"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.135236 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-ls2c9"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.137955 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.138953 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.139161 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz"]
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.140015 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.141213 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.143485 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.144758 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-ls2c9"]
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.145647 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.151340 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6w6\" (UniqueName: \"kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6\") pod \"auto-csr-approver-29550810-ls2c9\" (UID: \"b5a4e33a-3851-4e23-8f30-c766b7326dc0\") "
pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.158752 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz"] Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.252598 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.252840 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6w6\" (UniqueName: \"kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6\") pod \"auto-csr-approver-29550810-ls2c9\" (UID: \"b5a4e33a-3851-4e23-8f30-c766b7326dc0\") " pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.253065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.253131 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzqv\" (UniqueName: \"kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 
09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.275883 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6w6\" (UniqueName: \"kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6\") pod \"auto-csr-approver-29550810-ls2c9\" (UID: \"b5a4e33a-3851-4e23-8f30-c766b7326dc0\") " pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.355595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.355787 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.355820 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzqv\" (UniqueName: \"kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.357312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: 
\"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.369494 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.375131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzqv\" (UniqueName: \"kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv\") pod \"collect-profiles-29550810-mnxgz\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.453767 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.467116 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.687152 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-ls2c9"] Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.852365 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz"] Mar 09 09:30:00 crc kubenswrapper[4971]: W0309 09:30:00.855317 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b03fc2_bfd0_4c40_9d94_9df45ca324ca.slice/crio-f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106 WatchSource:0}: Error finding container f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106: Status 404 returned error can't find the container with id f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106 Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.976455 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" event={"ID":"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca","Type":"ContainerStarted","Data":"28e8d00e88609a040296db3ae88e57fd599dbd2eb57e24a81568b4d7217f0e44"} Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.976561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" event={"ID":"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca","Type":"ContainerStarted","Data":"f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106"} Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.977615 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" 
event={"ID":"b5a4e33a-3851-4e23-8f30-c766b7326dc0","Type":"ContainerStarted","Data":"9cd51e81019d7262becbab068d3bf26a5042fbe106da7a6dd023ac1517bc6c18"} Mar 09 09:30:00 crc kubenswrapper[4971]: I0309 09:30:00.990385 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" podStartSLOduration=0.990366279 podStartE2EDuration="990.366279ms" podCreationTimestamp="2026-03-09 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:30:00.989811154 +0000 UTC m=+604.549738984" watchObservedRunningTime="2026-03-09 09:30:00.990366279 +0000 UTC m=+604.550294089" Mar 09 09:30:01 crc kubenswrapper[4971]: I0309 09:30:01.987758 4971 generic.go:334] "Generic (PLEG): container finished" podID="d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" containerID="28e8d00e88609a040296db3ae88e57fd599dbd2eb57e24a81568b4d7217f0e44" exitCode=0 Mar 09 09:30:01 crc kubenswrapper[4971]: I0309 09:30:01.987806 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" event={"ID":"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca","Type":"ContainerDied","Data":"28e8d00e88609a040296db3ae88e57fd599dbd2eb57e24a81568b4d7217f0e44"} Mar 09 09:30:01 crc kubenswrapper[4971]: I0309 09:30:01.990099 4971 generic.go:334] "Generic (PLEG): container finished" podID="b5a4e33a-3851-4e23-8f30-c766b7326dc0" containerID="d907e0b21b6ee71fb5dc6e199d32241ec70725df639686feb43c09ab193fa9d4" exitCode=0 Mar 09 09:30:01 crc kubenswrapper[4971]: I0309 09:30:01.990145 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" event={"ID":"b5a4e33a-3851-4e23-8f30-c766b7326dc0","Type":"ContainerDied","Data":"d907e0b21b6ee71fb5dc6e199d32241ec70725df639686feb43c09ab193fa9d4"} Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.222696 4971 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.231279 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.291318 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume\") pod \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.291415 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm6w6\" (UniqueName: \"kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6\") pod \"b5a4e33a-3851-4e23-8f30-c766b7326dc0\" (UID: \"b5a4e33a-3851-4e23-8f30-c766b7326dc0\") " Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.291592 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume\") pod \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.291627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzqv\" (UniqueName: \"kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv\") pod \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\" (UID: \"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca\") " Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.293760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" (UID: "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.297282 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" (UID: "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.297981 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv" (OuterVolumeSpecName: "kube-api-access-fqzqv") pod "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" (UID: "d1b03fc2-bfd0-4c40-9d94-9df45ca324ca"). InnerVolumeSpecName "kube-api-access-fqzqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.298132 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6" (OuterVolumeSpecName: "kube-api-access-mm6w6") pod "b5a4e33a-3851-4e23-8f30-c766b7326dc0" (UID: "b5a4e33a-3851-4e23-8f30-c766b7326dc0"). InnerVolumeSpecName "kube-api-access-mm6w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.393516 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzqv\" (UniqueName: \"kubernetes.io/projected/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-kube-api-access-fqzqv\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.393753 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.393849 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm6w6\" (UniqueName: \"kubernetes.io/projected/b5a4e33a-3851-4e23-8f30-c766b7326dc0-kube-api-access-mm6w6\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4971]: I0309 09:30:03.393908 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b03fc2-bfd0-4c40-9d94-9df45ca324ca-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.003522 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" event={"ID":"d1b03fc2-bfd0-4c40-9d94-9df45ca324ca","Type":"ContainerDied","Data":"f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106"} Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.003570 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-mnxgz" Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.003591 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2124e441c8f7332a38946991e5d9aac21bd547b07eb1f5080843a5bc2772106" Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.005645 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" event={"ID":"b5a4e33a-3851-4e23-8f30-c766b7326dc0","Type":"ContainerDied","Data":"9cd51e81019d7262becbab068d3bf26a5042fbe106da7a6dd023ac1517bc6c18"} Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.005694 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd51e81019d7262becbab068d3bf26a5042fbe106da7a6dd023ac1517bc6c18" Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.005716 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-ls2c9" Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.290577 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-v5hhp"] Mar 09 09:30:04 crc kubenswrapper[4971]: I0309 09:30:04.294372 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-v5hhp"] Mar 09 09:30:05 crc kubenswrapper[4971]: I0309 09:30:05.157614 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3170fbc-6018-4b70-9f33-54a2e285fcd3" path="/var/lib/kubelet/pods/d3170fbc-6018-4b70-9f33-54a2e285fcd3/volumes" Mar 09 09:30:57 crc kubenswrapper[4971]: I0309 09:30:57.707472 4971 scope.go:117] "RemoveContainer" containerID="eadf83ab2987345d8537254ea5c39ea61b842d6ca15febd7c3dcc9ccd44446db" Mar 09 09:31:14 crc kubenswrapper[4971]: I0309 09:31:14.795197 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:31:14 crc kubenswrapper[4971]: I0309 09:31:14.795810 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.205802 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9bhsp"] Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.206790 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-controller" containerID="cri-o://3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.206912 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.206896 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="northd" containerID="cri-o://29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.206985 4971 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-node" containerID="cri-o://ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.207053 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-acl-logging" containerID="cri-o://dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.207061 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="nbdb" containerID="cri-o://bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.207016 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="sbdb" containerID="cri-o://51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.237082 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovnkube-controller" containerID="cri-o://e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b" gracePeriod=30 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.412153 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-acl-logging/0.log" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.413127 4971 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-controller/0.log" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.413959 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.413983 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.413990 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.413997 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414006 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414015 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334" exitCode=143 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414022 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb" exitCode=143 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 
09:31:18.414035 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414117 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414136 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414220 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.414266 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.416028 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-572n5_156929ae-cd9c-46c6-8bf1-bc28162f6917/kube-multus/0.log" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.416075 4971 generic.go:334] "Generic (PLEG): container finished" podID="156929ae-cd9c-46c6-8bf1-bc28162f6917" containerID="308e2c9f1c0aef07bddf1fe2dd614efd8e69ecbd27c2f8e0029dc3838c626674" exitCode=2 Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.416105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-572n5" event={"ID":"156929ae-cd9c-46c6-8bf1-bc28162f6917","Type":"ContainerDied","Data":"308e2c9f1c0aef07bddf1fe2dd614efd8e69ecbd27c2f8e0029dc3838c626674"} Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.416632 4971 scope.go:117] "RemoveContainer" containerID="308e2c9f1c0aef07bddf1fe2dd614efd8e69ecbd27c2f8e0029dc3838c626674" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.505254 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-acl-logging/0.log" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.505894 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-controller/0.log" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.506261 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.556865 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jbt72"]
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557088 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="nbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557105 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="nbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557118 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="northd"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557124 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="northd"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557130 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557136 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557148 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557153 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557160 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovnkube-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557165 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovnkube-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557174 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a4e33a-3851-4e23-8f30-c766b7326dc0" containerName="oc"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557179 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a4e33a-3851-4e23-8f30-c766b7326dc0" containerName="oc"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557187 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kubecfg-setup"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557192 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kubecfg-setup"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557200 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-node"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557206 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-node"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557217 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" containerName="collect-profiles"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557222 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" containerName="collect-profiles"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557229 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-acl-logging"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557236 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-acl-logging"
Mar 09 09:31:18 crc kubenswrapper[4971]: E0309 09:31:18.557243 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="sbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557249 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="sbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557335 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovnkube-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557364 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-controller"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557374 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="sbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557382 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a4e33a-3851-4e23-8f30-c766b7326dc0" containerName="oc"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557390 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="ovn-acl-logging"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557399 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="northd"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557406 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-node"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557414 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="nbdb"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557421 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerName="kube-rbac-proxy-ovn-metrics"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.557429 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b03fc2-bfd0-4c40-9d94-9df45ca324ca" containerName="collect-profiles"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.559281 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596207 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596261 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596295 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596314 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596379 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596413 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596516 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596616 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596636 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596655 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596666 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596782 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596807 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596820 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596874 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596920 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5czk\" (UniqueName: \"kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.596956 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config\") pod \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\" (UID: \"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6\") "
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597398 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597416 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597428 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597438 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597446 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log" (OuterVolumeSpecName: "node-log") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597476 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash" (OuterVolumeSpecName: "host-slash") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597477 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597956 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket" (OuterVolumeSpecName: "log-socket") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.597962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.598035 4971 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.598056 4971 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.598069 4971 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.598080 4971 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.603284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.603311 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk" (OuterVolumeSpecName: "kube-api-access-j5czk") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "kube-api-access-j5czk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.609811 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" (UID: "3a2ffbc4-02ce-4bfc-8732-7364ac5878e6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699083 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6c2\" (UniqueName: \"kubernetes.io/projected/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-kube-api-access-8w6c2\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699186 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-node-log\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699261 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699300 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-ovn\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-bin\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699423 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-slash\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699468 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-log-socket\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699517 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-systemd-units\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699550 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-etc-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699579 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-script-lib\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699628 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-config\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699690 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-kubelet\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699752 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699807 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699859 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-systemd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-netns\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699958 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-netd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.699988 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700035 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-var-lib-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-env-overrides\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700152 4971 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-node-log\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700179 4971 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700198 4971 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700214 4971 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700232 4971 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700248 4971 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-log-socket\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700271 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5czk\" (UniqueName: \"kubernetes.io/projected/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-kube-api-access-j5czk\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700312 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700338 4971 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700407 4971 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700432 4971 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700454 4971 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-slash\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700478 4971 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700503 4971 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700525 4971 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.700549 4971 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802212 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-ovn\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802293 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-bin\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802336 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-slash\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802511 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-log-socket\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802553 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-systemd-units\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802576 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-etc-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-script-lib\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802421 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-bin\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802631 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-config\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-kubelet\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802725 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802792 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-systemd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-netns\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802852 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-netd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802869 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-var-lib-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802936 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-env-overrides\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802969 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6c2\" (UniqueName: \"kubernetes.io/projected/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-kube-api-access-8w6c2\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-node-log\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803015 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803095 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802379 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-ovn\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803128 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-log-socket\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803173 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-kubelet\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803196 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803317 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-config\") pod \"ovnkube-node-jbt72\" (UID: 
\"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803412 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-systemd-units\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.802444 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-slash\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803466 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-etc-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-cni-netd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803766 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-var-lib-openvswitch\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803902 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovnkube-script-lib\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.803952 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-node-log\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.804131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-env-overrides\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.804210 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-run-systemd\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.804326 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-host-run-netns\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.806194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-ovn-node-metrics-cert\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.820064 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w6c2\" (UniqueName: \"kubernetes.io/projected/fcb37aab-9086-4778-8c24-a36ed6cc7ad2-kube-api-access-8w6c2\") pod \"ovnkube-node-jbt72\" (UID: \"fcb37aab-9086-4778-8c24-a36ed6cc7ad2\") " pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: I0309 09:31:18.875565 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" Mar 09 09:31:18 crc kubenswrapper[4971]: W0309 09:31:18.893689 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb37aab_9086_4778_8c24_a36ed6cc7ad2.slice/crio-f45506dcf464b7a2cc035e8c5d98bc3c56d2d5e436e08e636248d7fb437e0a43 WatchSource:0}: Error finding container f45506dcf464b7a2cc035e8c5d98bc3c56d2d5e436e08e636248d7fb437e0a43: Status 404 returned error can't find the container with id f45506dcf464b7a2cc035e8c5d98bc3c56d2d5e436e08e636248d7fb437e0a43 Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.435916 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-acl-logging/0.log" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.436550 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bhsp_3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/ovn-controller/0.log" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.437081 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" containerID="bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183" exitCode=0 Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.437264 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183"} Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.437386 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" event={"ID":"3a2ffbc4-02ce-4bfc-8732-7364ac5878e6","Type":"ContainerDied","Data":"2d159f020b7b482e3bb2301ad0e2eab87a3bbf0d63a10a6fa9a63438b96ee8e3"} Mar 09 09:31:19 crc 
kubenswrapper[4971]: I0309 09:31:19.437411 4971 scope.go:117] "RemoveContainer" containerID="e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.437488 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bhsp" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.439326 4971 generic.go:334] "Generic (PLEG): container finished" podID="fcb37aab-9086-4778-8c24-a36ed6cc7ad2" containerID="4cffe59d65b414710abc273f41dd3a68ae4871eb0330d9dc4f0a4d9e1eaaf3ce" exitCode=0 Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.439407 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerDied","Data":"4cffe59d65b414710abc273f41dd3a68ae4871eb0330d9dc4f0a4d9e1eaaf3ce"} Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.439435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"f45506dcf464b7a2cc035e8c5d98bc3c56d2d5e436e08e636248d7fb437e0a43"} Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.446872 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-572n5_156929ae-cd9c-46c6-8bf1-bc28162f6917/kube-multus/0.log" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.447252 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-572n5" event={"ID":"156929ae-cd9c-46c6-8bf1-bc28162f6917","Type":"ContainerStarted","Data":"5f70875abe9c2ad410278264d53672c3567ac9fb7348613715c7662617cc65b2"} Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.458777 4971 scope.go:117] "RemoveContainer" containerID="51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 
09:31:19.460791 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9bhsp"] Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.464252 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9bhsp"] Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.476693 4971 scope.go:117] "RemoveContainer" containerID="bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.503663 4971 scope.go:117] "RemoveContainer" containerID="29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.529313 4971 scope.go:117] "RemoveContainer" containerID="59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.547376 4971 scope.go:117] "RemoveContainer" containerID="ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.592580 4971 scope.go:117] "RemoveContainer" containerID="dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.608445 4971 scope.go:117] "RemoveContainer" containerID="3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.646659 4971 scope.go:117] "RemoveContainer" containerID="0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.670754 4971 scope.go:117] "RemoveContainer" containerID="e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.672109 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b\": container with ID 
starting with e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b not found: ID does not exist" containerID="e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.672153 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b"} err="failed to get container status \"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b\": rpc error: code = NotFound desc = could not find container \"e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b\": container with ID starting with e5c7c4811b81491cc4a4d2d125c7b4af5ee22ec7af2ae7307b7f80054dd21d7b not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.672181 4971 scope.go:117] "RemoveContainer" containerID="51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.672538 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8\": container with ID starting with 51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8 not found: ID does not exist" containerID="51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.672593 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8"} err="failed to get container status \"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8\": rpc error: code = NotFound desc = could not find container \"51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8\": container with ID starting with 51ead60c81aa051f1a5833235079813532303907fca8e7927390f1507f388da8 not found: 
ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.672627 4971 scope.go:117] "RemoveContainer" containerID="bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.673016 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183\": container with ID starting with bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183 not found: ID does not exist" containerID="bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673042 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183"} err="failed to get container status \"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183\": rpc error: code = NotFound desc = could not find container \"bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183\": container with ID starting with bd01cd5952c339dc781705cc90d089db0e24e789af8a57a40f3362e254be1183 not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673057 4971 scope.go:117] "RemoveContainer" containerID="29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.673412 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056\": container with ID starting with 29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056 not found: ID does not exist" containerID="29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673474 4971 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056"} err="failed to get container status \"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056\": rpc error: code = NotFound desc = could not find container \"29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056\": container with ID starting with 29f020bddec94a203a1e569fae2e369d84702dcc8bcf68f016fda56798a62056 not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673502 4971 scope.go:117] "RemoveContainer" containerID="59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.673902 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd\": container with ID starting with 59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd not found: ID does not exist" containerID="59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673933 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd"} err="failed to get container status \"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd\": rpc error: code = NotFound desc = could not find container \"59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd\": container with ID starting with 59358bfee0d3ea4c0990d9d6584366a5db170d4147cf5b334ba997c5f9d7dccd not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.673950 4971 scope.go:117] "RemoveContainer" containerID="ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.674217 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293\": container with ID starting with ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293 not found: ID does not exist" containerID="ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.674267 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293"} err="failed to get container status \"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293\": rpc error: code = NotFound desc = could not find container \"ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293\": container with ID starting with ab8e58e71df7ac41fdd077b1c5bfca8fd7227fc8613cc3e2c3c8de0c0a40e293 not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.674286 4971 scope.go:117] "RemoveContainer" containerID="dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334" Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.674617 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334\": container with ID starting with dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334 not found: ID does not exist" containerID="dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334" Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.674653 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334"} err="failed to get container status \"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334\": rpc error: code = NotFound desc = could 
not find container \"dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334\": container with ID starting with dfc0a02ca9288e6b478c1746877e3532b5930f4df108cd5d58548211cabdd334 not found: ID does not exist"
Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.674674 4971 scope.go:117] "RemoveContainer" containerID="3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb"
Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.674943 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb\": container with ID starting with 3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb not found: ID does not exist" containerID="3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb"
Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.674990 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb"} err="failed to get container status \"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb\": rpc error: code = NotFound desc = could not find container \"3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb\": container with ID starting with 3ae752a69b1dd4eb466ba0c9bbc03f430c2b83a5c4fdf84f5e7ed37decf875fb not found: ID does not exist"
Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.675008 4971 scope.go:117] "RemoveContainer" containerID="0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51"
Mar 09 09:31:19 crc kubenswrapper[4971]: E0309 09:31:19.675238 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51\": container with ID starting with 0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51 not found: ID does not exist" containerID="0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51"
Mar 09 09:31:19 crc kubenswrapper[4971]: I0309 09:31:19.675270 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51"} err="failed to get container status \"0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51\": rpc error: code = NotFound desc = could not find container \"0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51\": container with ID starting with 0399641bc491a3ec975fd05a9a91d1f76b5f3fed3c8e43cd06e51f67e386ea51 not found: ID does not exist"
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457060 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"59e73868123ca70e2204a909cb0c59a602b5f555458898599e10a66fee840f43"}
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457448 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"0300c03e1eee4f6072344e1041f64e5bbc3fc7c3682761fac9f1d3b7ff262157"}
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457458 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"1af03b0ef11a8563f97bf7b73401a3156585a4c165c33970f2483d55a751331f"}
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"4ba3424d26fd91dcd496330e1a8ef70befd2e4dee7c16785594e9a4fa149f040"}
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457477 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"2c0995c6fe8d99c7c5ee649afbc24c293b18336066c1e8baad8fc8337d46cd0c"}
Mar 09 09:31:20 crc kubenswrapper[4971]: I0309 09:31:20.457486 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"d0f90e04afd2a5cccc0be8a184d1511caf6bb814e388413e690ba6c27d7266ff"}
Mar 09 09:31:21 crc kubenswrapper[4971]: I0309 09:31:21.158658 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2ffbc4-02ce-4bfc-8732-7364ac5878e6" path="/var/lib/kubelet/pods/3a2ffbc4-02ce-4bfc-8732-7364ac5878e6/volumes"
Mar 09 09:31:22 crc kubenswrapper[4971]: I0309 09:31:22.470487 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"c8914ed51ab0cbc164ddbc93e5532071f761be03f53d968831d03444fc54e948"}
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.495538 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" event={"ID":"fcb37aab-9086-4778-8c24-a36ed6cc7ad2","Type":"ContainerStarted","Data":"f9a16343fed2821f9590edd144b16e0566632a5016e34d4e92e7e131a2ad6e19"}
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.496637 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.496657 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.496667 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.529154 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.529226 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:31:25 crc kubenswrapper[4971]: I0309 09:31:25.595615 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72" podStartSLOduration=7.595600327 podStartE2EDuration="7.595600327s" podCreationTimestamp="2026-03-09 09:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:31:25.525875755 +0000 UTC m=+689.085803565" watchObservedRunningTime="2026-03-09 09:31:25.595600327 +0000 UTC m=+689.155528137"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.674629 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"]
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.676880 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.679933 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.685518 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"]
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.874179 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.874538 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxhb\" (UniqueName: \"kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.874730 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.975331 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.975396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxhb\" (UniqueName: \"kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.975451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.975849 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.975882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.993121 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxhb\" (UniqueName: \"kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:42 crc kubenswrapper[4971]: I0309 09:31:42.995413 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:31:43 crc kubenswrapper[4971]: I0309 09:31:43.194696 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"]
Mar 09 09:31:43 crc kubenswrapper[4971]: I0309 09:31:43.791865 4971 generic.go:334] "Generic (PLEG): container finished" podID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerID="f25c28983c7e9489145527454ac20063860500eba76593538423d7749e16db10" exitCode=0
Mar 09 09:31:43 crc kubenswrapper[4971]: I0309 09:31:43.791966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh" event={"ID":"d32c7d36-749d-4cd4-a790-e1e702d6cd64","Type":"ContainerDied","Data":"f25c28983c7e9489145527454ac20063860500eba76593538423d7749e16db10"}
Mar 09 09:31:43 crc kubenswrapper[4971]: I0309 09:31:43.792208 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh" event={"ID":"d32c7d36-749d-4cd4-a790-e1e702d6cd64","Type":"ContainerStarted","Data":"1190239a08f361f427eccc1fa366477ab90976184eaa5a160fba0c5ba3ade790"}
Mar 09 09:31:44 crc kubenswrapper[4971]: I0309 09:31:44.794328 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:31:44 crc kubenswrapper[4971]: I0309 09:31:44.794426 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:31:48 crc kubenswrapper[4971]: I0309 09:31:48.895241 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jbt72"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.140538 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550812-gphdx"]
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.142155 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.144105 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.144283 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.144505 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.151802 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-gphdx"]
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.189760 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg\") pod \"auto-csr-approver-29550812-gphdx\" (UID: \"e736658b-7928-4d43-b26c-5c93e8fb5f99\") " pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.290744 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg\") pod \"auto-csr-approver-29550812-gphdx\" (UID: \"e736658b-7928-4d43-b26c-5c93e8fb5f99\") " pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.314592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg\") pod \"auto-csr-approver-29550812-gphdx\" (UID: \"e736658b-7928-4d43-b26c-5c93e8fb5f99\") " pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.466480 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.870622 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-gphdx"]
Mar 09 09:32:00 crc kubenswrapper[4971]: W0309 09:32:00.881622 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode736658b_7928_4d43_b26c_5c93e8fb5f99.slice/crio-cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810 WatchSource:0}: Error finding container cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810: Status 404 returned error can't find the container with id cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.887773 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-gphdx" event={"ID":"e736658b-7928-4d43-b26c-5c93e8fb5f99","Type":"ContainerStarted","Data":"cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810"}
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.891561 4971 generic.go:334] "Generic (PLEG): container finished" podID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerID="09763de57b22f129517225fae6c0fea47bae39a6c5f7b01ee1f39da781b9120f" exitCode=0
Mar 09 09:32:00 crc kubenswrapper[4971]: I0309 09:32:00.891657 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh" event={"ID":"d32c7d36-749d-4cd4-a790-e1e702d6cd64","Type":"ContainerDied","Data":"09763de57b22f129517225fae6c0fea47bae39a6c5f7b01ee1f39da781b9120f"}
Mar 09 09:32:01 crc kubenswrapper[4971]: I0309 09:32:01.900743 4971 generic.go:334] "Generic (PLEG): container finished" podID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerID="1887e540691644c95cff2d2a31ff75fede25477b61921bc9b3577b82c958d89e" exitCode=0
Mar 09 09:32:01 crc kubenswrapper[4971]: I0309 09:32:01.900818 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh" event={"ID":"d32c7d36-749d-4cd4-a790-e1e702d6cd64","Type":"ContainerDied","Data":"1887e540691644c95cff2d2a31ff75fede25477b61921bc9b3577b82c958d89e"}
Mar 09 09:32:02 crc kubenswrapper[4971]: I0309 09:32:02.908696 4971 generic.go:334] "Generic (PLEG): container finished" podID="e736658b-7928-4d43-b26c-5c93e8fb5f99" containerID="6f6ee7820a4785d9490723be5f5bafdf23d43e451ddd0d6e0573798c47b11cd9" exitCode=0
Mar 09 09:32:02 crc kubenswrapper[4971]: I0309 09:32:02.908787 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-gphdx" event={"ID":"e736658b-7928-4d43-b26c-5c93e8fb5f99","Type":"ContainerDied","Data":"6f6ee7820a4785d9490723be5f5bafdf23d43e451ddd0d6e0573798c47b11cd9"}
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.117901 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.236535 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle\") pod \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") "
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.236601 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxhb\" (UniqueName: \"kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb\") pod \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") "
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.236630 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util\") pod \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\" (UID: \"d32c7d36-749d-4cd4-a790-e1e702d6cd64\") "
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.237545 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle" (OuterVolumeSpecName: "bundle") pod "d32c7d36-749d-4cd4-a790-e1e702d6cd64" (UID: "d32c7d36-749d-4cd4-a790-e1e702d6cd64"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.243181 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb" (OuterVolumeSpecName: "kube-api-access-nxxhb") pod "d32c7d36-749d-4cd4-a790-e1e702d6cd64" (UID: "d32c7d36-749d-4cd4-a790-e1e702d6cd64"). InnerVolumeSpecName "kube-api-access-nxxhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.249880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util" (OuterVolumeSpecName: "util") pod "d32c7d36-749d-4cd4-a790-e1e702d6cd64" (UID: "d32c7d36-749d-4cd4-a790-e1e702d6cd64"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.338621 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.338666 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxhb\" (UniqueName: \"kubernetes.io/projected/d32c7d36-749d-4cd4-a790-e1e702d6cd64-kube-api-access-nxxhb\") on node \"crc\" DevicePath \"\""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.338681 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d32c7d36-749d-4cd4-a790-e1e702d6cd64-util\") on node \"crc\" DevicePath \"\""
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.916904 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh" event={"ID":"d32c7d36-749d-4cd4-a790-e1e702d6cd64","Type":"ContainerDied","Data":"1190239a08f361f427eccc1fa366477ab90976184eaa5a160fba0c5ba3ade790"}
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.917001 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1190239a08f361f427eccc1fa366477ab90976184eaa5a160fba0c5ba3ade790"
Mar 09 09:32:03 crc kubenswrapper[4971]: I0309 09:32:03.916930 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh"
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.106403 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.257343 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg\") pod \"e736658b-7928-4d43-b26c-5c93e8fb5f99\" (UID: \"e736658b-7928-4d43-b26c-5c93e8fb5f99\") "
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.261607 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg" (OuterVolumeSpecName: "kube-api-access-56qmg") pod "e736658b-7928-4d43-b26c-5c93e8fb5f99" (UID: "e736658b-7928-4d43-b26c-5c93e8fb5f99"). InnerVolumeSpecName "kube-api-access-56qmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.359179 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/e736658b-7928-4d43-b26c-5c93e8fb5f99-kube-api-access-56qmg\") on node \"crc\" DevicePath \"\""
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.922611 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-gphdx" event={"ID":"e736658b-7928-4d43-b26c-5c93e8fb5f99","Type":"ContainerDied","Data":"cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810"}
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.922929 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf39ec24b91c0bf341d2b31816c3e0cb7c44cee1f03d96447786067ceda49810"
Mar 09 09:32:04 crc kubenswrapper[4971]: I0309 09:32:04.922651 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-gphdx"
Mar 09 09:32:05 crc kubenswrapper[4971]: I0309 09:32:05.160051 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-5x6nv"]
Mar 09 09:32:05 crc kubenswrapper[4971]: I0309 09:32:05.160096 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-5x6nv"]
Mar 09 09:32:07 crc kubenswrapper[4971]: I0309 09:32:07.159613 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31b7627-abfd-4227-b142-0fcdca9e2b0b" path="/var/lib/kubelet/pods/a31b7627-abfd-4227-b142-0fcdca9e2b0b/volumes"
Mar 09 09:32:13 crc kubenswrapper[4971]: I0309 09:32:13.791851 4971 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.794907 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.794968 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.795014 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.795547 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.795616 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c" gracePeriod=600
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.988639 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c" exitCode=0
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.988719 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c"}
Mar 09 09:32:14 crc kubenswrapper[4971]: I0309 09:32:14.989040 4971 scope.go:117] "RemoveContainer" containerID="7aa603ba67328834de5950491258a16b4fddbca04efe1575ba7e19aa5d559570"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.928938 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"]
Mar 09 09:32:15 crc kubenswrapper[4971]: E0309 09:32:15.929510 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="pull"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929526 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="pull"
Mar 09 09:32:15 crc kubenswrapper[4971]: E0309 09:32:15.929550 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="util"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="util"
Mar 09 09:32:15 crc kubenswrapper[4971]: E0309 09:32:15.929568 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e736658b-7928-4d43-b26c-5c93e8fb5f99" containerName="oc"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929578 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e736658b-7928-4d43-b26c-5c93e8fb5f99" containerName="oc"
Mar 09 09:32:15 crc kubenswrapper[4971]: E0309 09:32:15.929590 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="extract"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929598 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="extract"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929720 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d32c7d36-749d-4cd4-a790-e1e702d6cd64" containerName="extract"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.929746 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e736658b-7928-4d43-b26c-5c93e8fb5f99" containerName="oc"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.930207 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.931989 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.932966 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.933015 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.933216 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.933239 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mh66f"
Mar 09 09:32:15 crc kubenswrapper[4971]: I0309 09:32:15.951866 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"]
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.012078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f"}
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.102244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wczp\" (UniqueName: \"kubernetes.io/projected/d00659bf-90ef-473d-b641-160aafb0e5cb-kube-api-access-2wczp\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.102439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-webhook-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.102676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-apiservice-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.203402 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-apiservice-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.203456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wczp\" (UniqueName: \"kubernetes.io/projected/d00659bf-90ef-473d-b641-160aafb0e5cb-kube-api-access-2wczp\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.203500 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-webhook-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.209259 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-webhook-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.211018 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d00659bf-90ef-473d-b641-160aafb0e5cb-apiservice-cert\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.220451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wczp\" (UniqueName: \"kubernetes.io/projected/d00659bf-90ef-473d-b641-160aafb0e5cb-kube-api-access-2wczp\") pod \"metallb-operator-controller-manager-6758965db4-5xg8k\" (UID: \"d00659bf-90ef-473d-b641-160aafb0e5cb\") " pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.246227 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.258646 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"]
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.259536 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.261643 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.262655 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.262869 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-42xxv"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.273288 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"]
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.305296 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-apiservice-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.305668 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrx5k\" (UniqueName: \"kubernetes.io/projected/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-kube-api-access-lrx5k\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.305762 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-webhook-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.407146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrx5k\" (UniqueName: \"kubernetes.io/projected/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-kube-api-access-lrx5k\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.408487 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-webhook-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"
Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.408535 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-apiservice-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.416885 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-apiservice-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.422098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-webhook-cert\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.430149 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrx5k\" (UniqueName: \"kubernetes.io/projected/f97f3e74-40e6-4980-a47c-e184ccb1ee4e-kube-api-access-lrx5k\") pod \"metallb-operator-webhook-server-78cf4d58c9-fftzx\" (UID: \"f97f3e74-40e6-4980-a47c-e184ccb1ee4e\") " pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.557678 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k"] Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.617621 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:16 crc kubenswrapper[4971]: I0309 09:32:16.844708 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx"] Mar 09 09:32:16 crc kubenswrapper[4971]: W0309 09:32:16.885860 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf97f3e74_40e6_4980_a47c_e184ccb1ee4e.slice/crio-bcd4ed0cd2303bc628e56c08a304d7b4d64a6b1f0bd8f79f3f2d3a3ac20aa714 WatchSource:0}: Error finding container bcd4ed0cd2303bc628e56c08a304d7b4d64a6b1f0bd8f79f3f2d3a3ac20aa714: Status 404 returned error can't find the container with id bcd4ed0cd2303bc628e56c08a304d7b4d64a6b1f0bd8f79f3f2d3a3ac20aa714 Mar 09 09:32:17 crc kubenswrapper[4971]: I0309 09:32:17.018146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" event={"ID":"f97f3e74-40e6-4980-a47c-e184ccb1ee4e","Type":"ContainerStarted","Data":"bcd4ed0cd2303bc628e56c08a304d7b4d64a6b1f0bd8f79f3f2d3a3ac20aa714"} Mar 09 09:32:17 crc kubenswrapper[4971]: I0309 09:32:17.019494 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k" event={"ID":"d00659bf-90ef-473d-b641-160aafb0e5cb","Type":"ContainerStarted","Data":"bf10d997e9e8a00ad9024f81116f8fe90f1f945ca2d795059ee06187d983e84d"} Mar 09 09:32:20 crc kubenswrapper[4971]: I0309 09:32:20.038280 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k" event={"ID":"d00659bf-90ef-473d-b641-160aafb0e5cb","Type":"ContainerStarted","Data":"359c23b5728e7b36e0973cf2e982f8a5039bee77b52464a2e35e9f2bfabc7a23"} Mar 09 09:32:20 crc kubenswrapper[4971]: I0309 09:32:20.038855 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k" Mar 09 09:32:20 crc kubenswrapper[4971]: I0309 09:32:20.067506 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k" podStartSLOduration=2.127198203 podStartE2EDuration="5.067484211s" podCreationTimestamp="2026-03-09 09:32:15 +0000 UTC" firstStartedPulling="2026-03-09 09:32:16.572243489 +0000 UTC m=+740.132171299" lastFinishedPulling="2026-03-09 09:32:19.512529487 +0000 UTC m=+743.072457307" observedRunningTime="2026-03-09 09:32:20.059253877 +0000 UTC m=+743.619181707" watchObservedRunningTime="2026-03-09 09:32:20.067484211 +0000 UTC m=+743.627412021" Mar 09 09:32:23 crc kubenswrapper[4971]: I0309 09:32:23.058116 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" event={"ID":"f97f3e74-40e6-4980-a47c-e184ccb1ee4e","Type":"ContainerStarted","Data":"979b5fd38d6ed45f29df8fa1052a4ec3cee6e610094c08666e299028523e32d8"} Mar 09 09:32:23 crc kubenswrapper[4971]: I0309 09:32:23.058468 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:23 crc kubenswrapper[4971]: I0309 09:32:23.079238 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" podStartSLOduration=1.9015032 podStartE2EDuration="7.079220804s" podCreationTimestamp="2026-03-09 09:32:16 +0000 UTC" firstStartedPulling="2026-03-09 09:32:16.893124161 +0000 UTC m=+740.453051971" lastFinishedPulling="2026-03-09 09:32:22.070841765 +0000 UTC m=+745.630769575" observedRunningTime="2026-03-09 09:32:23.077453592 +0000 UTC m=+746.637381402" watchObservedRunningTime="2026-03-09 09:32:23.079220804 +0000 UTC m=+746.639148614" Mar 09 09:32:36 crc kubenswrapper[4971]: I0309 09:32:36.624742 4971 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78cf4d58c9-fftzx" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.249181 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6758965db4-5xg8k" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.962171 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw"] Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.962872 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.966233 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.966340 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-dmm86" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.969113 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cs8lp"] Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.971723 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.974121 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.975032 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 09:32:56 crc kubenswrapper[4971]: I0309 09:32:56.980098 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw"] Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.061722 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4xcv7"] Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.062763 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.064769 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.064975 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.065484 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.065746 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xpgpk" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.066737 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-z4jb9"] Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.067878 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.071542 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.075866 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-z4jb9"] Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpjs\" (UniqueName: \"kubernetes.io/projected/624ce1af-f384-423e-847c-dc60c2996603-kube-api-access-whpjs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126178 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126226 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/624ce1af-f384-423e-847c-dc60c2996603-frr-startup\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126271 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784x7\" (UniqueName: \"kubernetes.io/projected/49bcc560-e687-4f99-9526-4baacbce3baa-kube-api-access-784x7\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126308 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-metrics\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126336 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-sockets\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126381 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-conf\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126412 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-reloader\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.126435 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/624ce1af-f384-423e-847c-dc60c2996603-metrics-certs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228019 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-sockets\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228331 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-conf\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228411 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-sockets\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228455 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-reloader\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228663 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-frr-conf\") pod \"frr-k8s-cs8lp\" (UID: 
\"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228723 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/624ce1af-f384-423e-847c-dc60c2996603-metrics-certs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-reloader\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-cert\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228813 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9jts\" (UniqueName: \"kubernetes.io/projected/3a469eec-42c7-456a-9315-d028751496cd-kube-api-access-f9jts\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpjs\" (UniqueName: \"kubernetes.io/projected/624ce1af-f384-423e-847c-dc60c2996603-kube-api-access-whpjs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc 
kubenswrapper[4971]: I0309 09:32:57.228877 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metrics-certs\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228895 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/624ce1af-f384-423e-847c-dc60c2996603-frr-startup\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228937 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-metrics-certs\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metallb-excludel2\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.228985 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-784x7\" (UniqueName: \"kubernetes.io/projected/49bcc560-e687-4f99-9526-4baacbce3baa-kube-api-access-784x7\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.229003 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkpbg\" (UniqueName: \"kubernetes.io/projected/6280ff70-b6ef-483e-a767-9b62f92c1d4e-kube-api-access-kkpbg\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.229022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-metrics\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.229229 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/624ce1af-f384-423e-847c-dc60c2996603-metrics\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.231779 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.231888 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.232120 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 09:32:57 crc kubenswrapper[4971]: E0309 09:32:57.239561 
4971 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 09 09:32:57 crc kubenswrapper[4971]: E0309 09:32:57.239673 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert podName:49bcc560-e687-4f99-9526-4baacbce3baa nodeName:}" failed. No retries permitted until 2026-03-09 09:32:57.739646315 +0000 UTC m=+781.299574125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert") pod "frr-k8s-webhook-server-7f989f654f-bvkfw" (UID: "49bcc560-e687-4f99-9526-4baacbce3baa") : secret "frr-k8s-webhook-server-cert" not found Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.240716 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/624ce1af-f384-423e-847c-dc60c2996603-frr-startup\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.253138 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/624ce1af-f384-423e-847c-dc60c2996603-metrics-certs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.253234 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784x7\" (UniqueName: \"kubernetes.io/projected/49bcc560-e687-4f99-9526-4baacbce3baa-kube-api-access-784x7\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.256592 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-whpjs\" (UniqueName: \"kubernetes.io/projected/624ce1af-f384-423e-847c-dc60c2996603-kube-api-access-whpjs\") pod \"frr-k8s-cs8lp\" (UID: \"624ce1af-f384-423e-847c-dc60c2996603\") " pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.295078 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-dmm86" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.303980 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329513 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-cert\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329577 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9jts\" (UniqueName: \"kubernetes.io/projected/3a469eec-42c7-456a-9315-d028751496cd-kube-api-access-f9jts\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329619 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metrics-certs\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329665 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-metrics-certs\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metallb-excludel2\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329717 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkpbg\" (UniqueName: \"kubernetes.io/projected/6280ff70-b6ef-483e-a767-9b62f92c1d4e-kube-api-access-kkpbg\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.329751 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.332978 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.333250 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.333279 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.333369 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 
09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.336154 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-metrics-certs\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: E0309 09:32:57.340823 4971 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 09:32:57 crc kubenswrapper[4971]: E0309 09:32:57.340999 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist podName:6280ff70-b6ef-483e-a767-9b62f92c1d4e nodeName:}" failed. No retries permitted until 2026-03-09 09:32:57.840977103 +0000 UTC m=+781.400904913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist") pod "speaker-4xcv7" (UID: "6280ff70-b6ef-483e-a767-9b62f92c1d4e") : secret "metallb-memberlist" not found Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.341361 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metallb-excludel2\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.343655 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-metrics-certs\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.344840 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a469eec-42c7-456a-9315-d028751496cd-cert\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.350939 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9jts\" (UniqueName: \"kubernetes.io/projected/3a469eec-42c7-456a-9315-d028751496cd-kube-api-access-f9jts\") pod \"controller-86ddb6bd46-z4jb9\" (UID: \"3a469eec-42c7-456a-9315-d028751496cd\") " pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.351638 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkpbg\" (UniqueName: \"kubernetes.io/projected/6280ff70-b6ef-483e-a767-9b62f92c1d4e-kube-api-access-kkpbg\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.412807 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.655341 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-z4jb9"] Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.783029 4971 scope.go:117] "RemoveContainer" containerID="dd5eef1804aa68fd008d1b1c595dfd66ff453887a895d5ba26105d768f3b6e03" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.837924 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.844596 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49bcc560-e687-4f99-9526-4baacbce3baa-cert\") pod \"frr-k8s-webhook-server-7f989f654f-bvkfw\" (UID: \"49bcc560-e687-4f99-9526-4baacbce3baa\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.883868 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.938883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.942798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6280ff70-b6ef-483e-a767-9b62f92c1d4e-memberlist\") pod \"speaker-4xcv7\" (UID: \"6280ff70-b6ef-483e-a767-9b62f92c1d4e\") " pod="metallb-system/speaker-4xcv7" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.987756 4971 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xpgpk" Mar 09 09:32:57 crc kubenswrapper[4971]: I0309 09:32:57.993739 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-4xcv7" Mar 09 09:32:58 crc kubenswrapper[4971]: W0309 09:32:58.020811 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6280ff70_b6ef_483e_a767_9b62f92c1d4e.slice/crio-4db254dd7d015e8ed63d68bb344de5de8dec933a51fd36e3e4054099c9d77774 WatchSource:0}: Error finding container 4db254dd7d015e8ed63d68bb344de5de8dec933a51fd36e3e4054099c9d77774: Status 404 returned error can't find the container with id 4db254dd7d015e8ed63d68bb344de5de8dec933a51fd36e3e4054099c9d77774 Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 09:32:58.107768 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw"] Mar 09 09:32:58 crc kubenswrapper[4971]: W0309 09:32:58.117980 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49bcc560_e687_4f99_9526_4baacbce3baa.slice/crio-3b48596856e78f185f4547ab5bebd6fb443115f7e9d01bda20a1a9fdb9fc6407 WatchSource:0}: Error finding container 3b48596856e78f185f4547ab5bebd6fb443115f7e9d01bda20a1a9fdb9fc6407: Status 404 returned error can't find the container with id 3b48596856e78f185f4547ab5bebd6fb443115f7e9d01bda20a1a9fdb9fc6407 Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 09:32:58.269593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"39568e049f4403fca0b449ddf92e627d71d9cafd2f6b807d81004ed11e54a84a"} Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 09:32:58.271129 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-z4jb9" event={"ID":"3a469eec-42c7-456a-9315-d028751496cd","Type":"ContainerStarted","Data":"04eee664906ea7a64acccf15355422fa3874cd65114b8da0d58cbac015d70627"} Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 
09:32:58.271169 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-z4jb9" event={"ID":"3a469eec-42c7-456a-9315-d028751496cd","Type":"ContainerStarted","Data":"6b979504ad660a62a61c120609c9b1b834fd7bf83baf00f371ed0f2b551bdbcb"} Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 09:32:58.278849 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4xcv7" event={"ID":"6280ff70-b6ef-483e-a767-9b62f92c1d4e","Type":"ContainerStarted","Data":"4db254dd7d015e8ed63d68bb344de5de8dec933a51fd36e3e4054099c9d77774"} Mar 09 09:32:58 crc kubenswrapper[4971]: I0309 09:32:58.281747 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" event={"ID":"49bcc560-e687-4f99-9526-4baacbce3baa","Type":"ContainerStarted","Data":"3b48596856e78f185f4547ab5bebd6fb443115f7e9d01bda20a1a9fdb9fc6407"} Mar 09 09:32:59 crc kubenswrapper[4971]: I0309 09:32:59.292223 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4xcv7" event={"ID":"6280ff70-b6ef-483e-a767-9b62f92c1d4e","Type":"ContainerStarted","Data":"851458bcc052021f7d150ea9d5f904f9a42bf56824c1e0f4324f5a4933c9da7c"} Mar 09 09:33:03 crc kubenswrapper[4971]: I0309 09:33:03.333152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-z4jb9" event={"ID":"3a469eec-42c7-456a-9315-d028751496cd","Type":"ContainerStarted","Data":"d3481029f02b826aac162e17a3cd78f38179948bf7e2143005f65ae27074cb4c"} Mar 09 09:33:03 crc kubenswrapper[4971]: I0309 09:33:03.333793 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:33:03 crc kubenswrapper[4971]: I0309 09:33:03.338017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4xcv7" 
event={"ID":"6280ff70-b6ef-483e-a767-9b62f92c1d4e","Type":"ContainerStarted","Data":"58da9061f6f7c5bb69d9b3311ae7af60ee623c65c3de43172af2cb38cedcfff0"} Mar 09 09:33:03 crc kubenswrapper[4971]: I0309 09:33:03.338390 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4xcv7" Mar 09 09:33:03 crc kubenswrapper[4971]: I0309 09:33:03.355734 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-z4jb9" podStartSLOduration=1.712502415 podStartE2EDuration="6.355718098s" podCreationTimestamp="2026-03-09 09:32:57 +0000 UTC" firstStartedPulling="2026-03-09 09:32:57.85803197 +0000 UTC m=+781.417959780" lastFinishedPulling="2026-03-09 09:33:02.501247653 +0000 UTC m=+786.061175463" observedRunningTime="2026-03-09 09:33:03.352468863 +0000 UTC m=+786.912396673" watchObservedRunningTime="2026-03-09 09:33:03.355718098 +0000 UTC m=+786.915645908" Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.174225 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4xcv7" podStartSLOduration=5.9615461320000005 podStartE2EDuration="10.174208651s" podCreationTimestamp="2026-03-09 09:32:57 +0000 UTC" firstStartedPulling="2026-03-09 09:32:58.296846227 +0000 UTC m=+781.856774047" lastFinishedPulling="2026-03-09 09:33:02.509508756 +0000 UTC m=+786.069436566" observedRunningTime="2026-03-09 09:33:03.383713091 +0000 UTC m=+786.943640901" watchObservedRunningTime="2026-03-09 09:33:07.174208651 +0000 UTC m=+790.734136461" Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.361774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" event={"ID":"49bcc560-e687-4f99-9526-4baacbce3baa","Type":"ContainerStarted","Data":"b781812be0e8aae58be928d07a1d004265527b13804f30092c3efd8ac5f2ec6d"} Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.362031 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.364701 4971 generic.go:334] "Generic (PLEG): container finished" podID="624ce1af-f384-423e-847c-dc60c2996603" containerID="e3971c23304dc0e1bf184b5fa62d2259dae812652220214b1401016c44595796" exitCode=0 Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.364751 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerDied","Data":"e3971c23304dc0e1bf184b5fa62d2259dae812652220214b1401016c44595796"} Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.380080 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" podStartSLOduration=2.690901579 podStartE2EDuration="11.380056391s" podCreationTimestamp="2026-03-09 09:32:56 +0000 UTC" firstStartedPulling="2026-03-09 09:32:58.120625788 +0000 UTC m=+781.680553598" lastFinishedPulling="2026-03-09 09:33:06.8097806 +0000 UTC m=+790.369708410" observedRunningTime="2026-03-09 09:33:07.377646901 +0000 UTC m=+790.937574721" watchObservedRunningTime="2026-03-09 09:33:07.380056391 +0000 UTC m=+790.939984201" Mar 09 09:33:07 crc kubenswrapper[4971]: I0309 09:33:07.422212 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-z4jb9" Mar 09 09:33:08 crc kubenswrapper[4971]: I0309 09:33:08.372953 4971 generic.go:334] "Generic (PLEG): container finished" podID="624ce1af-f384-423e-847c-dc60c2996603" containerID="60afffce54ae03edc4a198d5306d929761c6f11474da218587da55f442889a36" exitCode=0 Mar 09 09:33:08 crc kubenswrapper[4971]: I0309 09:33:08.373006 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" 
event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerDied","Data":"60afffce54ae03edc4a198d5306d929761c6f11474da218587da55f442889a36"} Mar 09 09:33:09 crc kubenswrapper[4971]: I0309 09:33:09.380532 4971 generic.go:334] "Generic (PLEG): container finished" podID="624ce1af-f384-423e-847c-dc60c2996603" containerID="daffd9b12992128f2742da78e2de581fefaed0db6356de9b957329202e61e073" exitCode=0 Mar 09 09:33:09 crc kubenswrapper[4971]: I0309 09:33:09.380635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerDied","Data":"daffd9b12992128f2742da78e2de581fefaed0db6356de9b957329202e61e073"} Mar 09 09:33:10 crc kubenswrapper[4971]: I0309 09:33:10.391273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"3896f7425871c41c79f6f159b6ec737744e75f5b2c76d97fe8fe43f2e10740cd"} Mar 09 09:33:10 crc kubenswrapper[4971]: I0309 09:33:10.391541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"b6fb08153e100249dcacfb3826b7d76e966a5cb34d66bc7f7022fb93a2fc9c74"} Mar 09 09:33:10 crc kubenswrapper[4971]: I0309 09:33:10.391555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"2b99e29f0af7c43feda54c2be24fb36a2edc5cb7c64082c543dde6a7eae8d79d"} Mar 09 09:33:10 crc kubenswrapper[4971]: I0309 09:33:10.391564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"366455a4b4e44b5973e40b40c1362159eef2ee86c4ec995ed39877c47b347fc9"} Mar 09 09:33:10 crc kubenswrapper[4971]: I0309 09:33:10.391572 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"9df6407dd325777760be7ab9e601d28bf9f7efd5f7f9533c2a6a827d90f2ef8f"} Mar 09 09:33:11 crc kubenswrapper[4971]: I0309 09:33:11.399983 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cs8lp" event={"ID":"624ce1af-f384-423e-847c-dc60c2996603","Type":"ContainerStarted","Data":"59f34396f2da9109ca50ce3020a3299d283785ac06da26edc301820d56efa2c5"} Mar 09 09:33:11 crc kubenswrapper[4971]: I0309 09:33:11.400905 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:33:11 crc kubenswrapper[4971]: I0309 09:33:11.426047 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cs8lp" podStartSLOduration=6.041370646 podStartE2EDuration="15.426030449s" podCreationTimestamp="2026-03-09 09:32:56 +0000 UTC" firstStartedPulling="2026-03-09 09:32:57.44905491 +0000 UTC m=+781.008982720" lastFinishedPulling="2026-03-09 09:33:06.833714713 +0000 UTC m=+790.393642523" observedRunningTime="2026-03-09 09:33:11.423816164 +0000 UTC m=+794.983743974" watchObservedRunningTime="2026-03-09 09:33:11.426030449 +0000 UTC m=+794.985958259" Mar 09 09:33:12 crc kubenswrapper[4971]: I0309 09:33:12.304631 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:33:12 crc kubenswrapper[4971]: I0309 09:33:12.343092 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:33:17 crc kubenswrapper[4971]: I0309 09:33:17.888435 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-bvkfw" Mar 09 09:33:17 crc kubenswrapper[4971]: I0309 09:33:17.997770 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/speaker-4xcv7" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.828769 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.830695 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.833461 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.833703 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-n5zt8" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.834079 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.839024 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.852816 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwgs\" (UniqueName: \"kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs\") pod \"mariadb-operator-index-q5hkl\" (UID: \"79acb95e-61c0-4d94-bcf5-a605bcd73450\") " pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:23 crc kubenswrapper[4971]: I0309 09:33:23.953696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwgs\" (UniqueName: \"kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs\") pod \"mariadb-operator-index-q5hkl\" (UID: \"79acb95e-61c0-4d94-bcf5-a605bcd73450\") " pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:23 
crc kubenswrapper[4971]: I0309 09:33:23.972483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwgs\" (UniqueName: \"kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs\") pod \"mariadb-operator-index-q5hkl\" (UID: \"79acb95e-61c0-4d94-bcf5-a605bcd73450\") " pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:24 crc kubenswrapper[4971]: I0309 09:33:24.149722 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:24 crc kubenswrapper[4971]: I0309 09:33:24.477784 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:24 crc kubenswrapper[4971]: W0309 09:33:24.621093 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79acb95e_61c0_4d94_bcf5_a605bcd73450.slice/crio-dccda2650e28cd9b37d085750f199ea54a9b8f620bb5afc7ac6e851590122300 WatchSource:0}: Error finding container dccda2650e28cd9b37d085750f199ea54a9b8f620bb5afc7ac6e851590122300: Status 404 returned error can't find the container with id dccda2650e28cd9b37d085750f199ea54a9b8f620bb5afc7ac6e851590122300 Mar 09 09:33:24 crc kubenswrapper[4971]: I0309 09:33:24.624440 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:33:25 crc kubenswrapper[4971]: I0309 09:33:25.495469 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q5hkl" event={"ID":"79acb95e-61c0-4d94-bcf5-a605bcd73450","Type":"ContainerStarted","Data":"dccda2650e28cd9b37d085750f199ea54a9b8f620bb5afc7ac6e851590122300"} Mar 09 09:33:26 crc kubenswrapper[4971]: I0309 09:33:26.502849 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q5hkl" 
event={"ID":"79acb95e-61c0-4d94-bcf5-a605bcd73450","Type":"ContainerStarted","Data":"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09"} Mar 09 09:33:26 crc kubenswrapper[4971]: I0309 09:33:26.523912 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-q5hkl" podStartSLOduration=2.684820582 podStartE2EDuration="3.523892254s" podCreationTimestamp="2026-03-09 09:33:23 +0000 UTC" firstStartedPulling="2026-03-09 09:33:24.624190337 +0000 UTC m=+808.184118147" lastFinishedPulling="2026-03-09 09:33:25.463262009 +0000 UTC m=+809.023189819" observedRunningTime="2026-03-09 09:33:26.518159975 +0000 UTC m=+810.078087785" watchObservedRunningTime="2026-03-09 09:33:26.523892254 +0000 UTC m=+810.083820064" Mar 09 09:33:27 crc kubenswrapper[4971]: I0309 09:33:27.213370 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:27 crc kubenswrapper[4971]: I0309 09:33:27.306707 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cs8lp" Mar 09 09:33:27 crc kubenswrapper[4971]: I0309 09:33:27.817845 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-g8xc7"] Mar 09 09:33:27 crc kubenswrapper[4971]: I0309 09:33:27.818877 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:27 crc kubenswrapper[4971]: I0309 09:33:27.834183 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-g8xc7"] Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.001282 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbmmf\" (UniqueName: \"kubernetes.io/projected/ae6f5029-30ab-4f16-bae0-38c580d4acfa-kube-api-access-rbmmf\") pod \"mariadb-operator-index-g8xc7\" (UID: \"ae6f5029-30ab-4f16-bae0-38c580d4acfa\") " pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.102909 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbmmf\" (UniqueName: \"kubernetes.io/projected/ae6f5029-30ab-4f16-bae0-38c580d4acfa-kube-api-access-rbmmf\") pod \"mariadb-operator-index-g8xc7\" (UID: \"ae6f5029-30ab-4f16-bae0-38c580d4acfa\") " pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.124041 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbmmf\" (UniqueName: \"kubernetes.io/projected/ae6f5029-30ab-4f16-bae0-38c580d4acfa-kube-api-access-rbmmf\") pod \"mariadb-operator-index-g8xc7\" (UID: \"ae6f5029-30ab-4f16-bae0-38c580d4acfa\") " pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.146691 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.517736 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-q5hkl" podUID="79acb95e-61c0-4d94-bcf5-a605bcd73450" containerName="registry-server" containerID="cri-o://b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09" gracePeriod=2 Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.527999 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-g8xc7"] Mar 09 09:33:28 crc kubenswrapper[4971]: W0309 09:33:28.538002 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae6f5029_30ab_4f16_bae0_38c580d4acfa.slice/crio-0d78b03e3fc2482848b48d3de52b555c129c96d2b6f79b7f8bca2b4843372494 WatchSource:0}: Error finding container 0d78b03e3fc2482848b48d3de52b555c129c96d2b6f79b7f8bca2b4843372494: Status 404 returned error can't find the container with id 0d78b03e3fc2482848b48d3de52b555c129c96d2b6f79b7f8bca2b4843372494 Mar 09 09:33:28 crc kubenswrapper[4971]: I0309 09:33:28.845896 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.015234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cwgs\" (UniqueName: \"kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs\") pod \"79acb95e-61c0-4d94-bcf5-a605bcd73450\" (UID: \"79acb95e-61c0-4d94-bcf5-a605bcd73450\") " Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.022148 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs" (OuterVolumeSpecName: "kube-api-access-8cwgs") pod "79acb95e-61c0-4d94-bcf5-a605bcd73450" (UID: "79acb95e-61c0-4d94-bcf5-a605bcd73450"). InnerVolumeSpecName "kube-api-access-8cwgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.116855 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cwgs\" (UniqueName: \"kubernetes.io/projected/79acb95e-61c0-4d94-bcf5-a605bcd73450-kube-api-access-8cwgs\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.524906 4971 generic.go:334] "Generic (PLEG): container finished" podID="79acb95e-61c0-4d94-bcf5-a605bcd73450" containerID="b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09" exitCode=0 Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.525020 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q5hkl" event={"ID":"79acb95e-61c0-4d94-bcf5-a605bcd73450","Type":"ContainerDied","Data":"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09"} Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.525072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-q5hkl" 
event={"ID":"79acb95e-61c0-4d94-bcf5-a605bcd73450","Type":"ContainerDied","Data":"dccda2650e28cd9b37d085750f199ea54a9b8f620bb5afc7ac6e851590122300"} Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.525103 4971 scope.go:117] "RemoveContainer" containerID="b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.525286 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-q5hkl" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.528440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-g8xc7" event={"ID":"ae6f5029-30ab-4f16-bae0-38c580d4acfa","Type":"ContainerStarted","Data":"eb6724386b593072ba86e5de6eafb06993e9305879f90c6726b51d8a93f8e16e"} Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.528468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-g8xc7" event={"ID":"ae6f5029-30ab-4f16-bae0-38c580d4acfa","Type":"ContainerStarted","Data":"0d78b03e3fc2482848b48d3de52b555c129c96d2b6f79b7f8bca2b4843372494"} Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.551957 4971 scope.go:117] "RemoveContainer" containerID="b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09" Mar 09 09:33:29 crc kubenswrapper[4971]: E0309 09:33:29.553045 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09\": container with ID starting with b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09 not found: ID does not exist" containerID="b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.553122 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09"} err="failed to get container status \"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09\": rpc error: code = NotFound desc = could not find container \"b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09\": container with ID starting with b85711bff826e0be108ada9d375800aa70590534ceb2b4f96b59c5b77ebd0c09 not found: ID does not exist" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.553237 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-g8xc7" podStartSLOduration=2.121218864 podStartE2EDuration="2.553208371s" podCreationTimestamp="2026-03-09 09:33:27 +0000 UTC" firstStartedPulling="2026-03-09 09:33:28.543330089 +0000 UTC m=+812.103257899" lastFinishedPulling="2026-03-09 09:33:28.975319596 +0000 UTC m=+812.535247406" observedRunningTime="2026-03-09 09:33:29.551676176 +0000 UTC m=+813.111603986" watchObservedRunningTime="2026-03-09 09:33:29.553208371 +0000 UTC m=+813.113136181" Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.568708 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:29 crc kubenswrapper[4971]: I0309 09:33:29.572658 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-q5hkl"] Mar 09 09:33:31 crc kubenswrapper[4971]: I0309 09:33:31.161129 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79acb95e-61c0-4d94-bcf5-a605bcd73450" path="/var/lib/kubelet/pods/79acb95e-61c0-4d94-bcf5-a605bcd73450/volumes" Mar 09 09:33:38 crc kubenswrapper[4971]: I0309 09:33:38.147299 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-g8xc7" Mar 09 09:33:38 crc kubenswrapper[4971]: I0309 09:33:38.147654 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-g8xc7"
Mar 09 09:33:38 crc kubenswrapper[4971]: I0309 09:33:38.178169 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-g8xc7"
Mar 09 09:33:38 crc kubenswrapper[4971]: I0309 09:33:38.601482 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-g8xc7"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.884083 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"]
Mar 09 09:33:43 crc kubenswrapper[4971]: E0309 09:33:43.884633 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79acb95e-61c0-4d94-bcf5-a605bcd73450" containerName="registry-server"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.884648 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79acb95e-61c0-4d94-bcf5-a605bcd73450" containerName="registry-server"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.884783 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="79acb95e-61c0-4d94-bcf5-a605bcd73450" containerName="registry-server"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.885721 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.887489 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.894651 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"]
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.905677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpbw\" (UniqueName: \"kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.905753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:43 crc kubenswrapper[4971]: I0309 09:33:43.905784 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.006386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpbw\" (UniqueName: \"kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.006649 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.006702 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.007122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.007140 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.029216 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpbw\" (UniqueName: \"kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.206774 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.392195 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"]
Mar 09 09:33:44 crc kubenswrapper[4971]: W0309 09:33:44.397811 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29e24e00_d64d_44ac_9ea3_b6cfb014d046.slice/crio-00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5 WatchSource:0}: Error finding container 00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5: Status 404 returned error can't find the container with id 00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5
Mar 09 09:33:44 crc kubenswrapper[4971]: I0309 09:33:44.616448 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs" event={"ID":"29e24e00-d64d-44ac-9ea3-b6cfb014d046","Type":"ContainerStarted","Data":"00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5"}
Mar 09 09:33:45 crc kubenswrapper[4971]: I0309 09:33:45.623986 4971 generic.go:334] "Generic (PLEG): container finished" podID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerID="5ca01e02b4de8153eec2378cfacaed9c7a8a94d4a563fef8a6b6716aa5228045" exitCode=0
Mar 09 09:33:45 crc kubenswrapper[4971]: I0309 09:33:45.624118 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs" event={"ID":"29e24e00-d64d-44ac-9ea3-b6cfb014d046","Type":"ContainerDied","Data":"5ca01e02b4de8153eec2378cfacaed9c7a8a94d4a563fef8a6b6716aa5228045"}
Mar 09 09:33:47 crc kubenswrapper[4971]: I0309 09:33:47.640647 4971 generic.go:334] "Generic (PLEG): container finished" podID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerID="18e72628263f0ebbe2777e66f31522f6a22657ad6ed8ba7f530becb8efb4d9c2" exitCode=0
Mar 09 09:33:47 crc kubenswrapper[4971]: I0309 09:33:47.640717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs" event={"ID":"29e24e00-d64d-44ac-9ea3-b6cfb014d046","Type":"ContainerDied","Data":"18e72628263f0ebbe2777e66f31522f6a22657ad6ed8ba7f530becb8efb4d9c2"}
Mar 09 09:33:48 crc kubenswrapper[4971]: I0309 09:33:48.648556 4971 generic.go:334] "Generic (PLEG): container finished" podID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerID="972ba1fbd2d461ef07d94b097b78502d3262ccc532017a3d6bc34008ed3542d3" exitCode=0
Mar 09 09:33:48 crc kubenswrapper[4971]: I0309 09:33:48.649030 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs" event={"ID":"29e24e00-d64d-44ac-9ea3-b6cfb014d046","Type":"ContainerDied","Data":"972ba1fbd2d461ef07d94b097b78502d3262ccc532017a3d6bc34008ed3542d3"}
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.879241 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.901184 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rpbw\" (UniqueName: \"kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw\") pod \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") "
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.901660 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle\") pod \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") "
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.901687 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util\") pod \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\" (UID: \"29e24e00-d64d-44ac-9ea3-b6cfb014d046\") "
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.902826 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle" (OuterVolumeSpecName: "bundle") pod "29e24e00-d64d-44ac-9ea3-b6cfb014d046" (UID: "29e24e00-d64d-44ac-9ea3-b6cfb014d046"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:33:49 crc kubenswrapper[4971]: I0309 09:33:49.949777 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw" (OuterVolumeSpecName: "kube-api-access-7rpbw") pod "29e24e00-d64d-44ac-9ea3-b6cfb014d046" (UID: "29e24e00-d64d-44ac-9ea3-b6cfb014d046"). InnerVolumeSpecName "kube-api-access-7rpbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.002830 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.002864 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rpbw\" (UniqueName: \"kubernetes.io/projected/29e24e00-d64d-44ac-9ea3-b6cfb014d046-kube-api-access-7rpbw\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.196564 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util" (OuterVolumeSpecName: "util") pod "29e24e00-d64d-44ac-9ea3-b6cfb014d046" (UID: "29e24e00-d64d-44ac-9ea3-b6cfb014d046"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.205934 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e24e00-d64d-44ac-9ea3-b6cfb014d046-util\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.663086 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs" event={"ID":"29e24e00-d64d-44ac-9ea3-b6cfb014d046","Type":"ContainerDied","Data":"00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5"}
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.663135 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00bc9cd635b14ac03a9c1c5f52d2cdc149a8f6a3a981e2a742d693f14e594fb5"
Mar 09 09:33:50 crc kubenswrapper[4971]: I0309 09:33:50.663437 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.002553 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"]
Mar 09 09:33:57 crc kubenswrapper[4971]: E0309 09:33:57.003475 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="pull"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.003490 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="pull"
Mar 09 09:33:57 crc kubenswrapper[4971]: E0309 09:33:57.003509 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="util"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.003516 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="util"
Mar 09 09:33:57 crc kubenswrapper[4971]: E0309 09:33:57.003530 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="extract"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.003538 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="extract"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.003672 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e24e00-d64d-44ac-9ea3-b6cfb014d046" containerName="extract"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.004138 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.008928 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.009465 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-76w78"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.021785 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"]
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.023115 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.100443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9cm\" (UniqueName: \"kubernetes.io/projected/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-kube-api-access-5w9cm\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.100771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-webhook-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.100906 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-apiservice-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.202498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9cm\" (UniqueName: \"kubernetes.io/projected/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-kube-api-access-5w9cm\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.202764 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-webhook-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.202838 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-apiservice-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.204662 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.218686 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-webhook-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.221414 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9cm\" (UniqueName: \"kubernetes.io/projected/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-kube-api-access-5w9cm\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.222400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7-apiservice-cert\") pod \"mariadb-operator-controller-manager-5794c4499-rj58k\" (UID: \"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7\") " pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.327130 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-76w78"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.335831 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.528867 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"]
Mar 09 09:33:57 crc kubenswrapper[4971]: I0309 09:33:57.704828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k" event={"ID":"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7","Type":"ContainerStarted","Data":"95401992ffaa47e4cac68e14a49de21c98e35ab9ffe85ca364f0dc7c34c9ed5c"}
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.223876 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550814-b5bs7"]
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.225424 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.227616 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.227650 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.230045 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.230402 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-b5bs7"]
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.262484 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpcfh\" (UniqueName: \"kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh\") pod \"auto-csr-approver-29550814-b5bs7\" (UID: \"05e7a398-d1de-4a76-bd69-ff9c7269a24a\") " pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.363333 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpcfh\" (UniqueName: \"kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh\") pod \"auto-csr-approver-29550814-b5bs7\" (UID: \"05e7a398-d1de-4a76-bd69-ff9c7269a24a\") " pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.382517 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpcfh\" (UniqueName: \"kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh\") pod \"auto-csr-approver-29550814-b5bs7\" (UID: \"05e7a398-d1de-4a76-bd69-ff9c7269a24a\") " pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:00 crc kubenswrapper[4971]: I0309 09:34:00.540602 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:01 crc kubenswrapper[4971]: I0309 09:34:01.420711 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-b5bs7"]
Mar 09 09:34:01 crc kubenswrapper[4971]: W0309 09:34:01.428894 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e7a398_d1de_4a76_bd69_ff9c7269a24a.slice/crio-982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527 WatchSource:0}: Error finding container 982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527: Status 404 returned error can't find the container with id 982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527
Mar 09 09:34:01 crc kubenswrapper[4971]: I0309 09:34:01.733963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-b5bs7" event={"ID":"05e7a398-d1de-4a76-bd69-ff9c7269a24a","Type":"ContainerStarted","Data":"982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527"}
Mar 09 09:34:01 crc kubenswrapper[4971]: I0309 09:34:01.736102 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k" event={"ID":"f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7","Type":"ContainerStarted","Data":"4cc7cbf52dafeafe5803958d2a6a3b049014c12692acd40ecae79432e7a24b45"}
Mar 09 09:34:01 crc kubenswrapper[4971]: I0309 09:34:01.737688 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:34:02 crc kubenswrapper[4971]: I0309 09:34:02.743313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-b5bs7" event={"ID":"05e7a398-d1de-4a76-bd69-ff9c7269a24a","Type":"ContainerStarted","Data":"92a19ad23e1640c11c41f05e7db9e460f2c9b32944f1ef05fceb489fb73ec537"}
Mar 09 09:34:02 crc kubenswrapper[4971]: I0309 09:34:02.757883 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k" podStartSLOduration=3.282189164 podStartE2EDuration="6.757864292s" podCreationTimestamp="2026-03-09 09:33:56 +0000 UTC" firstStartedPulling="2026-03-09 09:33:57.534203899 +0000 UTC m=+841.094131709" lastFinishedPulling="2026-03-09 09:34:01.009879027 +0000 UTC m=+844.569806837" observedRunningTime="2026-03-09 09:34:01.762727647 +0000 UTC m=+845.322655467" watchObservedRunningTime="2026-03-09 09:34:02.757864292 +0000 UTC m=+846.317792102"
Mar 09 09:34:02 crc kubenswrapper[4971]: I0309 09:34:02.759983 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550814-b5bs7" podStartSLOduration=1.8324031029999999 podStartE2EDuration="2.759975993s" podCreationTimestamp="2026-03-09 09:34:00 +0000 UTC" firstStartedPulling="2026-03-09 09:34:01.430318921 +0000 UTC m=+844.990246731" lastFinishedPulling="2026-03-09 09:34:02.357891811 +0000 UTC m=+845.917819621" observedRunningTime="2026-03-09 09:34:02.754842934 +0000 UTC m=+846.314770744" watchObservedRunningTime="2026-03-09 09:34:02.759975993 +0000 UTC m=+846.319903803"
Mar 09 09:34:03 crc kubenswrapper[4971]: I0309 09:34:03.749907 4971 generic.go:334] "Generic (PLEG): container finished" podID="05e7a398-d1de-4a76-bd69-ff9c7269a24a" containerID="92a19ad23e1640c11c41f05e7db9e460f2c9b32944f1ef05fceb489fb73ec537" exitCode=0
Mar 09 09:34:03 crc kubenswrapper[4971]: I0309 09:34:03.750027 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-b5bs7" event={"ID":"05e7a398-d1de-4a76-bd69-ff9c7269a24a","Type":"ContainerDied","Data":"92a19ad23e1640c11c41f05e7db9e460f2c9b32944f1ef05fceb489fb73ec537"}
Mar 09 09:34:04 crc kubenswrapper[4971]: I0309 09:34:04.995566 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.047625 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpcfh\" (UniqueName: \"kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh\") pod \"05e7a398-d1de-4a76-bd69-ff9c7269a24a\" (UID: \"05e7a398-d1de-4a76-bd69-ff9c7269a24a\") "
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.052713 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh" (OuterVolumeSpecName: "kube-api-access-gpcfh") pod "05e7a398-d1de-4a76-bd69-ff9c7269a24a" (UID: "05e7a398-d1de-4a76-bd69-ff9c7269a24a"). InnerVolumeSpecName "kube-api-access-gpcfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.148743 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpcfh\" (UniqueName: \"kubernetes.io/projected/05e7a398-d1de-4a76-bd69-ff9c7269a24a-kube-api-access-gpcfh\") on node \"crc\" DevicePath \"\""
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.762272 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-b5bs7" event={"ID":"05e7a398-d1de-4a76-bd69-ff9c7269a24a","Type":"ContainerDied","Data":"982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527"}
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.762336 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="982cf23dd1f3e147cd240f1a52383cae1c87d8bba72eeac9c20228a602089527"
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.762369 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-b5bs7"
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.797236 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-w22d4"]
Mar 09 09:34:05 crc kubenswrapper[4971]: I0309 09:34:05.800604 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-w22d4"]
Mar 09 09:34:07 crc kubenswrapper[4971]: I0309 09:34:07.158829 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee51aea1-202c-473d-ac89-4db3058e25a1" path="/var/lib/kubelet/pods/ee51aea1-202c-473d-ac89-4db3058e25a1/volumes"
Mar 09 09:34:07 crc kubenswrapper[4971]: I0309 09:34:07.341594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5794c4499-rj58k"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.536789 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-rvqm6"]
Mar 09 09:34:14 crc kubenswrapper[4971]: E0309 09:34:14.537542 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e7a398-d1de-4a76-bd69-ff9c7269a24a" containerName="oc"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.537558 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e7a398-d1de-4a76-bd69-ff9c7269a24a" containerName="oc"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.537674 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e7a398-d1de-4a76-bd69-ff9c7269a24a" containerName="oc"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.538127 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.540405 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-brvvm"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.545642 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rvqm6"]
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.663299 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdlc\" (UniqueName: \"kubernetes.io/projected/5d62c895-9226-41c3-b4b3-23f6990c1ee7-kube-api-access-xrdlc\") pod \"infra-operator-index-rvqm6\" (UID: \"5d62c895-9226-41c3-b4b3-23f6990c1ee7\") " pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.764883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdlc\" (UniqueName: \"kubernetes.io/projected/5d62c895-9226-41c3-b4b3-23f6990c1ee7-kube-api-access-xrdlc\") pod \"infra-operator-index-rvqm6\" (UID: \"5d62c895-9226-41c3-b4b3-23f6990c1ee7\") " pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.783454 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdlc\" (UniqueName: \"kubernetes.io/projected/5d62c895-9226-41c3-b4b3-23f6990c1ee7-kube-api-access-xrdlc\") pod \"infra-operator-index-rvqm6\" (UID: \"5d62c895-9226-41c3-b4b3-23f6990c1ee7\") " pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:14 crc kubenswrapper[4971]: I0309 09:34:14.891623 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:15 crc kubenswrapper[4971]: I0309 09:34:15.319323 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-rvqm6"]
Mar 09 09:34:15 crc kubenswrapper[4971]: W0309 09:34:15.329204 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d62c895_9226_41c3_b4b3_23f6990c1ee7.slice/crio-986ceecaffadf698d379dfddeb310aacf943be3f10f9e18760bbf617fddfc682 WatchSource:0}: Error finding container 986ceecaffadf698d379dfddeb310aacf943be3f10f9e18760bbf617fddfc682: Status 404 returned error can't find the container with id 986ceecaffadf698d379dfddeb310aacf943be3f10f9e18760bbf617fddfc682
Mar 09 09:34:15 crc kubenswrapper[4971]: I0309 09:34:15.828154 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rvqm6" event={"ID":"5d62c895-9226-41c3-b4b3-23f6990c1ee7","Type":"ContainerStarted","Data":"986ceecaffadf698d379dfddeb310aacf943be3f10f9e18760bbf617fddfc682"}
Mar 09 09:34:17 crc kubenswrapper[4971]: I0309 09:34:17.842857 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-rvqm6" event={"ID":"5d62c895-9226-41c3-b4b3-23f6990c1ee7","Type":"ContainerStarted","Data":"212d86a03adaa8ed47c4eb5944a27dea3a052e7af232f89ec414c4c3a9118ccd"}
Mar 09 09:34:17 crc kubenswrapper[4971]: I0309 09:34:17.861749 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-rvqm6" podStartSLOduration=2.358115829 podStartE2EDuration="3.861730408s" podCreationTimestamp="2026-03-09 09:34:14 +0000 UTC" firstStartedPulling="2026-03-09 09:34:15.331566394 +0000 UTC m=+858.891494204" lastFinishedPulling="2026-03-09 09:34:16.835180973 +0000 UTC m=+860.395108783" observedRunningTime="2026-03-09 09:34:17.859164904 +0000 UTC m=+861.419092714" watchObservedRunningTime="2026-03-09 09:34:17.861730408 +0000 UTC m=+861.421658218"
Mar 09 09:34:24 crc kubenswrapper[4971]: I0309 09:34:24.891906 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:24 crc kubenswrapper[4971]: I0309 09:34:24.892511 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:24 crc kubenswrapper[4971]: I0309 09:34:24.920272 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:25 crc kubenswrapper[4971]: I0309 09:34:25.920465 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-rvqm6"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.182638 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"]
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.184491 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.191091 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.195740 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"]
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.329937 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pshs6\" (UniqueName: \"kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.329988 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.330046 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.431090 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pshs6\" (UniqueName: \"kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.431155 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.431219 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.431758 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"
Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.431836 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.450332 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pshs6\" (UniqueName: \"kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.508744 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.911036 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf"] Mar 09 09:34:34 crc kubenswrapper[4971]: W0309 09:34:34.915779 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c57993_da6a_45d8_8103_c90eb33399b0.slice/crio-02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2 WatchSource:0}: Error finding container 02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2: Status 404 returned error can't find the container with id 02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2 Mar 09 09:34:34 crc kubenswrapper[4971]: I0309 09:34:34.944809 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" 
event={"ID":"48c57993-da6a-45d8-8103-c90eb33399b0","Type":"ContainerStarted","Data":"02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2"} Mar 09 09:34:35 crc kubenswrapper[4971]: I0309 09:34:35.952251 4971 generic.go:334] "Generic (PLEG): container finished" podID="48c57993-da6a-45d8-8103-c90eb33399b0" containerID="acba6d2ab8737facca58d1de6c3fddb851a5f7e8ab548d0b11a13d9a2ec366e2" exitCode=0 Mar 09 09:34:35 crc kubenswrapper[4971]: I0309 09:34:35.952415 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" event={"ID":"48c57993-da6a-45d8-8103-c90eb33399b0","Type":"ContainerDied","Data":"acba6d2ab8737facca58d1de6c3fddb851a5f7e8ab548d0b11a13d9a2ec366e2"} Mar 09 09:34:36 crc kubenswrapper[4971]: I0309 09:34:36.959858 4971 generic.go:334] "Generic (PLEG): container finished" podID="48c57993-da6a-45d8-8103-c90eb33399b0" containerID="36ded9d972ba029a91149bffe0511e5a8bc3ea232001e5eb926a371743e3f435" exitCode=0 Mar 09 09:34:36 crc kubenswrapper[4971]: I0309 09:34:36.959933 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" event={"ID":"48c57993-da6a-45d8-8103-c90eb33399b0","Type":"ContainerDied","Data":"36ded9d972ba029a91149bffe0511e5a8bc3ea232001e5eb926a371743e3f435"} Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.550398 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.551593 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.557685 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.677115 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwtgj\" (UniqueName: \"kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.677183 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.677257 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.778923 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwtgj\" (UniqueName: \"kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.779557 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.779679 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.780077 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.780125 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.823666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwtgj\" (UniqueName: \"kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj\") pod \"redhat-operators-8b74t\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.868506 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.972163 4971 generic.go:334] "Generic (PLEG): container finished" podID="48c57993-da6a-45d8-8103-c90eb33399b0" containerID="3087b30fb6bc8e62370c81df942a6149edce97fc0d6243726bd6043cc8347a76" exitCode=0 Mar 09 09:34:37 crc kubenswrapper[4971]: I0309 09:34:37.972216 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" event={"ID":"48c57993-da6a-45d8-8103-c90eb33399b0","Type":"ContainerDied","Data":"3087b30fb6bc8e62370c81df942a6149edce97fc0d6243726bd6043cc8347a76"} Mar 09 09:34:38 crc kubenswrapper[4971]: I0309 09:34:38.106795 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:38 crc kubenswrapper[4971]: W0309 09:34:38.110845 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc95e23_ea4d_432e_a1fa_eed98bc40b7b.slice/crio-2f394f3b11868f1bebe9651eaa37a4625dca58b7716d554a34d8d27ab2b93077 WatchSource:0}: Error finding container 2f394f3b11868f1bebe9651eaa37a4625dca58b7716d554a34d8d27ab2b93077: Status 404 returned error can't find the container with id 2f394f3b11868f1bebe9651eaa37a4625dca58b7716d554a34d8d27ab2b93077 Mar 09 09:34:38 crc kubenswrapper[4971]: I0309 09:34:38.979855 4971 generic.go:334] "Generic (PLEG): container finished" podID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerID="61cdd23dd01f597b26823150551a6e5213482e9ab08b866eb98dfe1751de6eeb" exitCode=0 Mar 09 09:34:38 crc kubenswrapper[4971]: I0309 09:34:38.979940 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerDied","Data":"61cdd23dd01f597b26823150551a6e5213482e9ab08b866eb98dfe1751de6eeb"} Mar 09 09:34:38 crc 
kubenswrapper[4971]: I0309 09:34:38.980000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerStarted","Data":"2f394f3b11868f1bebe9651eaa37a4625dca58b7716d554a34d8d27ab2b93077"} Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.269715 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.405617 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pshs6\" (UniqueName: \"kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6\") pod \"48c57993-da6a-45d8-8103-c90eb33399b0\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.405732 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util\") pod \"48c57993-da6a-45d8-8103-c90eb33399b0\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.405753 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle\") pod \"48c57993-da6a-45d8-8103-c90eb33399b0\" (UID: \"48c57993-da6a-45d8-8103-c90eb33399b0\") " Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.407623 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle" (OuterVolumeSpecName: "bundle") pod "48c57993-da6a-45d8-8103-c90eb33399b0" (UID: "48c57993-da6a-45d8-8103-c90eb33399b0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.411796 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6" (OuterVolumeSpecName: "kube-api-access-pshs6") pod "48c57993-da6a-45d8-8103-c90eb33399b0" (UID: "48c57993-da6a-45d8-8103-c90eb33399b0"). InnerVolumeSpecName "kube-api-access-pshs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.420987 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util" (OuterVolumeSpecName: "util") pod "48c57993-da6a-45d8-8103-c90eb33399b0" (UID: "48c57993-da6a-45d8-8103-c90eb33399b0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.507084 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.507131 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48c57993-da6a-45d8-8103-c90eb33399b0-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.507145 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pshs6\" (UniqueName: \"kubernetes.io/projected/48c57993-da6a-45d8-8103-c90eb33399b0-kube-api-access-pshs6\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.987156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" 
event={"ID":"48c57993-da6a-45d8-8103-c90eb33399b0","Type":"ContainerDied","Data":"02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2"} Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.987484 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02622d6913ca38badb05d6e3ad8f623a5ea6e30689ecef51dc0c40bb2b6f03a2" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.987201 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf" Mar 09 09:34:39 crc kubenswrapper[4971]: I0309 09:34:39.988928 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerStarted","Data":"60dd31495ad4467ef48bd971d88af4024945ac5a803718a440dbe149c2d6323e"} Mar 09 09:34:40 crc kubenswrapper[4971]: I0309 09:34:40.998282 4971 generic.go:334] "Generic (PLEG): container finished" podID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerID="60dd31495ad4467ef48bd971d88af4024945ac5a803718a440dbe149c2d6323e" exitCode=0 Mar 09 09:34:40 crc kubenswrapper[4971]: I0309 09:34:40.998318 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerDied","Data":"60dd31495ad4467ef48bd971d88af4024945ac5a803718a440dbe149c2d6323e"} Mar 09 09:34:42 crc kubenswrapper[4971]: I0309 09:34:42.011984 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerStarted","Data":"c30c9e3e51956f18592742280215dd489ae20fcf48ed4feb44e6b73cc94be084"} Mar 09 09:34:42 crc kubenswrapper[4971]: I0309 09:34:42.030597 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-8b74t" podStartSLOduration=2.621921987 podStartE2EDuration="5.030580974s" podCreationTimestamp="2026-03-09 09:34:37 +0000 UTC" firstStartedPulling="2026-03-09 09:34:38.981704579 +0000 UTC m=+882.541632389" lastFinishedPulling="2026-03-09 09:34:41.390363566 +0000 UTC m=+884.950291376" observedRunningTime="2026-03-09 09:34:42.027368871 +0000 UTC m=+885.587296701" watchObservedRunningTime="2026-03-09 09:34:42.030580974 +0000 UTC m=+885.590508784" Mar 09 09:34:44 crc kubenswrapper[4971]: I0309 09:34:44.795050 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:34:44 crc kubenswrapper[4971]: I0309 09:34:44.795120 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:34:47 crc kubenswrapper[4971]: I0309 09:34:47.869394 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:47 crc kubenswrapper[4971]: I0309 09:34:47.869742 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:47 crc kubenswrapper[4971]: I0309 09:34:47.905842 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:48 crc kubenswrapper[4971]: I0309 09:34:48.079194 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 
09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.336405 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.338613 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8b74t" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="registry-server" containerID="cri-o://c30c9e3e51956f18592742280215dd489ae20fcf48ed4feb44e6b73cc94be084" gracePeriod=2 Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.764543 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw"] Mar 09 09:34:51 crc kubenswrapper[4971]: E0309 09:34:51.764802 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="util" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.764813 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="util" Mar 09 09:34:51 crc kubenswrapper[4971]: E0309 09:34:51.764828 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="extract" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.764835 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="extract" Mar 09 09:34:51 crc kubenswrapper[4971]: E0309 09:34:51.764850 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="pull" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.764856 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="pull" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.764972 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48c57993-da6a-45d8-8103-c90eb33399b0" containerName="extract" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.765369 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.767926 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.769316 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-78cw5" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.813410 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw"] Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.888320 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-webhook-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.888404 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-apiservice-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.888442 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhd4\" 
(UniqueName: \"kubernetes.io/projected/ed9e539e-9f00-4168-9486-c1aa126c0514-kube-api-access-bhhd4\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.989663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-webhook-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.990026 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-apiservice-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:51 crc kubenswrapper[4971]: I0309 09:34:51.990064 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhd4\" (UniqueName: \"kubernetes.io/projected/ed9e539e-9f00-4168-9486-c1aa126c0514-kube-api-access-bhhd4\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.004465 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-apiservice-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " 
pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.005033 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed9e539e-9f00-4168-9486-c1aa126c0514-webhook-cert\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.007620 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhd4\" (UniqueName: \"kubernetes.io/projected/ed9e539e-9f00-4168-9486-c1aa126c0514-kube-api-access-bhhd4\") pod \"infra-operator-controller-manager-557f5c56bb-4glvw\" (UID: \"ed9e539e-9f00-4168-9486-c1aa126c0514\") " pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.081981 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.308525 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw"] Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.973542 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.974766 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.979416 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-config-data" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.981174 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-scripts" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.981487 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"kube-root-ca.crt" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.982863 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openshift-service-ca.crt" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.983120 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"galera-openstack-dockercfg-q7bdd" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.983757 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.984724 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.992139 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 09:34:52 crc kubenswrapper[4971]: I0309 09:34:52.993027 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.001845 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.007084 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.023108 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.073678 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" event={"ID":"ed9e539e-9f00-4168-9486-c1aa126c0514","Type":"ContainerStarted","Data":"3d661832a5d21e4c35e442d590c08f1dbf6cd812991fcc6324f9663a2e99cbb9"} Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.076280 4971 generic.go:334] "Generic (PLEG): container finished" podID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerID="c30c9e3e51956f18592742280215dd489ae20fcf48ed4feb44e6b73cc94be084" exitCode=0 Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.076315 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerDied","Data":"c30c9e3e51956f18592742280215dd489ae20fcf48ed4feb44e6b73cc94be084"} Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.102970 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-default\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103048 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-generated\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-default\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103106 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-kolla-config\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103141 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnbw\" (UniqueName: \"kubernetes.io/projected/09cf3e3d-f27d-4258-a35a-17172dce14cf-kube-api-access-svnbw\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103163 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fbn\" (UniqueName: \"kubernetes.io/projected/7fde2aa4-e297-4641-b450-e95ea05b5229-kube-api-access-65fbn\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103273 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-kolla-config\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103382 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103416 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-operator-scripts\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103449 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103549 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-default\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103587 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-generated\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103741 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103791 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2j6r\" (UniqueName: \"kubernetes.io/projected/0248bf28-3089-40e7-9ab1-2131010368c4-kube-api-access-s2j6r\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-operator-scripts\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103870 
4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-kolla-config\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.103912 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-generated\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205727 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-kolla-config\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205742 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-default\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-svnbw\" (UniqueName: \"kubernetes.io/projected/09cf3e3d-f27d-4258-a35a-17172dce14cf-kube-api-access-svnbw\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205803 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fbn\" (UniqueName: \"kubernetes.io/projected/7fde2aa4-e297-4641-b450-e95ea05b5229-kube-api-access-65fbn\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.205823 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-kolla-config\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-operator-scripts\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206121 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-default\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-generated\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206553 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") device mount path \"/mnt/openstack/pv05\"" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207558 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") device mount path \"/mnt/openstack/pv03\"" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207686 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207075 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-config-data-default\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-kolla-config\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207809 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-generated\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207863 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 
09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207916 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2j6r\" (UniqueName: \"kubernetes.io/projected/0248bf28-3089-40e7-9ab1-2131010368c4-kube-api-access-s2j6r\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-operator-scripts\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-kolla-config\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207996 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-kolla-config\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.208034 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.208089 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-default\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.208321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-generated\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.208908 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-config-data-default\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.209400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-kolla-config\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.207692 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0248bf28-3089-40e7-9ab1-2131010368c4-operator-scripts\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.209542 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.209860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3e3d-f27d-4258-a35a-17172dce14cf-operator-scripts\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.210078 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.206961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7fde2aa4-e297-4641-b450-e95ea05b5229-config-data-default\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.227520 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.228100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnbw\" (UniqueName: \"kubernetes.io/projected/09cf3e3d-f27d-4258-a35a-17172dce14cf-kube-api-access-svnbw\") pod \"openstack-galera-1\" (UID: \"09cf3e3d-f27d-4258-a35a-17172dce14cf\") " 
pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.228768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.229063 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2j6r\" (UniqueName: \"kubernetes.io/projected/0248bf28-3089-40e7-9ab1-2131010368c4-kube-api-access-s2j6r\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.229714 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65fbn\" (UniqueName: \"kubernetes.io/projected/7fde2aa4-e297-4641-b450-e95ea05b5229-kube-api-access-65fbn\") pod \"openstack-galera-0\" (UID: \"7fde2aa4-e297-4641-b450-e95ea05b5229\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.235098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"0248bf28-3089-40e7-9ab1-2131010368c4\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.296507 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.310229 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.320563 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.709625 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.782292 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.819831 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities\") pod \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.819918 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content\") pod \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.820035 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwtgj\" (UniqueName: \"kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj\") pod \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\" (UID: \"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b\") " Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.820793 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities" (OuterVolumeSpecName: "utilities") pod "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" (UID: "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.823575 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.825000 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.830104 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj" (OuterVolumeSpecName: "kube-api-access-dwtgj") pod "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" (UID: "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b"). InnerVolumeSpecName "kube-api-access-dwtgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.924981 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwtgj\" (UniqueName: \"kubernetes.io/projected/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-kube-api-access-dwtgj\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:53 crc kubenswrapper[4971]: I0309 09:34:53.968864 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" (UID: "bdc95e23-ea4d-432e-a1fa-eed98bc40b7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.026662 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.083182 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.087048 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8b74t" event={"ID":"bdc95e23-ea4d-432e-a1fa-eed98bc40b7b","Type":"ContainerDied","Data":"2f394f3b11868f1bebe9651eaa37a4625dca58b7716d554a34d8d27ab2b93077"} Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.087109 4971 scope.go:117] "RemoveContainer" containerID="c30c9e3e51956f18592742280215dd489ae20fcf48ed4feb44e6b73cc94be084" Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.087995 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8b74t" Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.089599 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"0248bf28-3089-40e7-9ab1-2131010368c4","Type":"ContainerStarted","Data":"622d04d83cdabf10895e42469231da64e57f302cd9ac0f45e76ac6b1653c82c1"} Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.093245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"09cf3e3d-f27d-4258-a35a-17172dce14cf","Type":"ContainerStarted","Data":"98f42d9ce81d368c2568ed8140e381d20cd59140045f0b3f3c09aba2fe6a2e29"} Mar 09 09:34:54 crc kubenswrapper[4971]: W0309 09:34:54.098080 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fde2aa4_e297_4641_b450_e95ea05b5229.slice/crio-c44760236ad3f62630471f296d2354cb4790a485d093d70c10280bfe6112a1e6 WatchSource:0}: Error finding container c44760236ad3f62630471f296d2354cb4790a485d093d70c10280bfe6112a1e6: Status 404 returned error can't find the container with id c44760236ad3f62630471f296d2354cb4790a485d093d70c10280bfe6112a1e6 Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.116963 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.121225 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8b74t"] Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.122541 4971 scope.go:117] "RemoveContainer" containerID="60dd31495ad4467ef48bd971d88af4024945ac5a803718a440dbe149c2d6323e" Mar 09 09:34:54 crc kubenswrapper[4971]: I0309 09:34:54.158416 4971 scope.go:117] "RemoveContainer" containerID="61cdd23dd01f597b26823150551a6e5213482e9ab08b866eb98dfe1751de6eeb" Mar 09 09:34:55 crc kubenswrapper[4971]: I0309 
09:34:55.117834 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"7fde2aa4-e297-4641-b450-e95ea05b5229","Type":"ContainerStarted","Data":"c44760236ad3f62630471f296d2354cb4790a485d093d70c10280bfe6112a1e6"}
Mar 09 09:34:55 crc kubenswrapper[4971]: I0309 09:34:55.163210 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" path="/var/lib/kubelet/pods/bdc95e23-ea4d-432e-a1fa-eed98bc40b7b/volumes"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.341664 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p4n75"]
Mar 09 09:34:56 crc kubenswrapper[4971]: E0309 09:34:56.341962 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="registry-server"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.341980 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="registry-server"
Mar 09 09:34:56 crc kubenswrapper[4971]: E0309 09:34:56.341999 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="extract-utilities"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.342008 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="extract-utilities"
Mar 09 09:34:56 crc kubenswrapper[4971]: E0309 09:34:56.342019 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="extract-content"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.342026 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="extract-content"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.342179 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc95e23-ea4d-432e-a1fa-eed98bc40b7b" containerName="registry-server"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.343195 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.367561 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4n75"]
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.487184 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.487314 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljfm\" (UniqueName: \"kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.487402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.593175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljfm\" (UniqueName: \"kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.593557 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.593612 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.594070 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.594600 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.630508 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljfm\" (UniqueName: \"kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm\") pod \"redhat-marketplace-p4n75\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:56 crc kubenswrapper[4971]: I0309 09:34:56.669525 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:34:57 crc kubenswrapper[4971]: I0309 09:34:57.172220 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" event={"ID":"ed9e539e-9f00-4168-9486-c1aa126c0514","Type":"ContainerStarted","Data":"ef51e72a68a85c0881f628e8dec2ce49dc1c62fcbb647651cd5505d433940a0d"}
Mar 09 09:34:57 crc kubenswrapper[4971]: I0309 09:34:57.172906 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw"
Mar 09 09:34:57 crc kubenswrapper[4971]: I0309 09:34:57.220862 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4n75"]
Mar 09 09:34:57 crc kubenswrapper[4971]: I0309 09:34:57.317381 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw" podStartSLOduration=2.26170291 podStartE2EDuration="6.317367726s" podCreationTimestamp="2026-03-09 09:34:51 +0000 UTC" firstStartedPulling="2026-03-09 09:34:52.317805328 +0000 UTC m=+895.877733138" lastFinishedPulling="2026-03-09 09:34:56.373470144 +0000 UTC m=+899.933397954" observedRunningTime="2026-03-09 09:34:57.316610634 +0000 UTC m=+900.876538454" watchObservedRunningTime="2026-03-09 09:34:57.317367726 +0000 UTC m=+900.877295536"
Mar 09 09:34:57 crc kubenswrapper[4971]: I0309 09:34:57.883970 4971 scope.go:117] "RemoveContainer" containerID="fd3a314e7c2e35d412daa54c7b0d57ffeb9845b8a321765df386308d1d9eae61"
Mar 09 09:34:58 crc kubenswrapper[4971]: I0309 09:34:58.202538 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerStarted","Data":"b485b47de8345387d6cc7d7799058b47f4f1600d9cb0f9610519d5c7ea85e9ea"}
Mar 09 09:35:01 crc kubenswrapper[4971]: I0309 09:35:01.241138 4971 generic.go:334] "Generic (PLEG): container finished" podID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerID="e6b100f102496b79d3f0aa6d4a19a718bd10f3d799f31cb1342fecf86484ce82" exitCode=0
Mar 09 09:35:01 crc kubenswrapper[4971]: I0309 09:35:01.241337 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerDied","Data":"e6b100f102496b79d3f0aa6d4a19a718bd10f3d799f31cb1342fecf86484ce82"}
Mar 09 09:35:02 crc kubenswrapper[4971]: I0309 09:35:02.086433 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-557f5c56bb-4glvw"
Mar 09 09:35:04 crc kubenswrapper[4971]: I0309 09:35:04.264621 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerStarted","Data":"1e0f697926ebbc93419b5e0b85d3c7c2a80b58a707c8fc2a9e36ce1382c34a0e"}
Mar 09 09:35:04 crc kubenswrapper[4971]: I0309 09:35:04.270964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"0248bf28-3089-40e7-9ab1-2131010368c4","Type":"ContainerStarted","Data":"d1b04530e56d39044bec2ab933974120a0cefcb689c173eb83dceec414feb5de"}
Mar 09 09:35:04 crc kubenswrapper[4971]: I0309 09:35:04.273812 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"7fde2aa4-e297-4641-b450-e95ea05b5229","Type":"ContainerStarted","Data":"7b1115067b4d84cfd867e9d974dafe90bb232b9a13d6444cef0a9a472e52cd01"}
Mar 09 09:35:04 crc kubenswrapper[4971]: I0309 09:35:04.275675 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"09cf3e3d-f27d-4258-a35a-17172dce14cf","Type":"ContainerStarted","Data":"94597ef68fdd8433887a101752c87dba34267c9571b1338e22dab40026fd6236"}
Mar 09 09:35:05 crc kubenswrapper[4971]: I0309 09:35:05.287723 4971 generic.go:334] "Generic (PLEG): container finished" podID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerID="1e0f697926ebbc93419b5e0b85d3c7c2a80b58a707c8fc2a9e36ce1382c34a0e" exitCode=0
Mar 09 09:35:05 crc kubenswrapper[4971]: I0309 09:35:05.287849 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerDied","Data":"1e0f697926ebbc93419b5e0b85d3c7c2a80b58a707c8fc2a9e36ce1382c34a0e"}
Mar 09 09:35:06 crc kubenswrapper[4971]: I0309 09:35:06.295761 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerStarted","Data":"d95c412ac78408149037104c692c549018b850d13202811d62603b7bf6993c87"}
Mar 09 09:35:06 crc kubenswrapper[4971]: I0309 09:35:06.316071 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p4n75" podStartSLOduration=5.722318057 podStartE2EDuration="10.316053989s" podCreationTimestamp="2026-03-09 09:34:56 +0000 UTC" firstStartedPulling="2026-03-09 09:35:01.293200423 +0000 UTC m=+904.853128233" lastFinishedPulling="2026-03-09 09:35:05.886936355 +0000 UTC m=+909.446864165" observedRunningTime="2026-03-09 09:35:06.313112801 +0000 UTC m=+909.873040611" watchObservedRunningTime="2026-03-09 09:35:06.316053989 +0000 UTC m=+909.875981799"
Mar 09 09:35:06 crc kubenswrapper[4971]: I0309 09:35:06.670455 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:35:06 crc kubenswrapper[4971]: I0309 09:35:06.670490 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.589236 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/memcached-0"]
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.591383 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.597812 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"memcached-memcached-dockercfg-kv8cf"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.598051 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"memcached-config-data"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.659959 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"]
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.668817 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kolla-config\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.668888 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-config-data\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.668934 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54f7\" (UniqueName: \"kubernetes.io/projected/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kube-api-access-p54f7\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.734975 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-p4n75" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:35:07 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:35:07 crc kubenswrapper[4971]: >
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.770582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-config-data\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.770680 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54f7\" (UniqueName: \"kubernetes.io/projected/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kube-api-access-p54f7\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.770736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kolla-config\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.771674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-config-data\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.771772 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kolla-config\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.814324 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54f7\" (UniqueName: \"kubernetes.io/projected/e4c9ed17-abec-40ab-acd0-aa857fd946f9-kube-api-access-p54f7\") pod \"memcached-0\" (UID: \"e4c9ed17-abec-40ab-acd0-aa857fd946f9\") " pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:07 crc kubenswrapper[4971]: I0309 09:35:07.910529 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.314625 4971 generic.go:334] "Generic (PLEG): container finished" podID="0248bf28-3089-40e7-9ab1-2131010368c4" containerID="d1b04530e56d39044bec2ab933974120a0cefcb689c173eb83dceec414feb5de" exitCode=0
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.314717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"0248bf28-3089-40e7-9ab1-2131010368c4","Type":"ContainerDied","Data":"d1b04530e56d39044bec2ab933974120a0cefcb689c173eb83dceec414feb5de"}
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.316766 4971 generic.go:334] "Generic (PLEG): container finished" podID="7fde2aa4-e297-4641-b450-e95ea05b5229" containerID="7b1115067b4d84cfd867e9d974dafe90bb232b9a13d6444cef0a9a472e52cd01" exitCode=0
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.316835 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"7fde2aa4-e297-4641-b450-e95ea05b5229","Type":"ContainerDied","Data":"7b1115067b4d84cfd867e9d974dafe90bb232b9a13d6444cef0a9a472e52cd01"}
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.319923 4971 generic.go:334] "Generic (PLEG): container finished" podID="09cf3e3d-f27d-4258-a35a-17172dce14cf" containerID="94597ef68fdd8433887a101752c87dba34267c9571b1338e22dab40026fd6236" exitCode=0
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.320009 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"09cf3e3d-f27d-4258-a35a-17172dce14cf","Type":"ContainerDied","Data":"94597ef68fdd8433887a101752c87dba34267c9571b1338e22dab40026fd6236"}
Mar 09 09:35:08 crc kubenswrapper[4971]: I0309 09:35:08.385020 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"]
Mar 09 09:35:08 crc kubenswrapper[4971]: W0309 09:35:08.403683 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c9ed17_abec_40ab_acd0_aa857fd946f9.slice/crio-11575a6fe1d50fe2299a424e66ec250ea55a09059f9e86ea26012c89463e3760 WatchSource:0}: Error finding container 11575a6fe1d50fe2299a424e66ec250ea55a09059f9e86ea26012c89463e3760: Status 404 returned error can't find the container with id 11575a6fe1d50fe2299a424e66ec250ea55a09059f9e86ea26012c89463e3760
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.340183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"0248bf28-3089-40e7-9ab1-2131010368c4","Type":"ContainerStarted","Data":"85e690cf9194d7c977faf51afd99c2c39795c06343e405c43885f0bc6fde332a"}
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.342932 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"7fde2aa4-e297-4641-b450-e95ea05b5229","Type":"ContainerStarted","Data":"3e6b627a08ae9ba0e0a00d9a43a920ffe7325f287bae6e20fe227d2181071d15"}
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.344192 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"e4c9ed17-abec-40ab-acd0-aa857fd946f9","Type":"ContainerStarted","Data":"11575a6fe1d50fe2299a424e66ec250ea55a09059f9e86ea26012c89463e3760"}
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.346781 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"09cf3e3d-f27d-4258-a35a-17172dce14cf","Type":"ContainerStarted","Data":"58f143d0dcf765d54a4821b13c02399fadadf9b738400779351a2c7674ac6557"}
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.363742 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-2" podStartSLOduration=8.248803971 podStartE2EDuration="18.363719864s" podCreationTimestamp="2026-03-09 09:34:51 +0000 UTC" firstStartedPulling="2026-03-09 09:34:53.789819911 +0000 UTC m=+897.349747731" lastFinishedPulling="2026-03-09 09:35:03.904735814 +0000 UTC m=+907.464663624" observedRunningTime="2026-03-09 09:35:09.359984545 +0000 UTC m=+912.919912375" watchObservedRunningTime="2026-03-09 09:35:09.363719864 +0000 UTC m=+912.923647674"
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.383329 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-1" podStartSLOduration=8.402585606 podStartE2EDuration="18.383305464s" podCreationTimestamp="2026-03-09 09:34:51 +0000 UTC" firstStartedPulling="2026-03-09 09:34:53.830245861 +0000 UTC m=+897.390173671" lastFinishedPulling="2026-03-09 09:35:03.810965719 +0000 UTC m=+907.370893529" observedRunningTime="2026-03-09 09:35:09.38164921 +0000 UTC m=+912.941577040" watchObservedRunningTime="2026-03-09 09:35:09.383305464 +0000 UTC m=+912.943233274"
Mar 09 09:35:09 crc kubenswrapper[4971]: I0309 09:35:09.411073 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-0" podStartSLOduration=8.672911738 podStartE2EDuration="18.411054401s" podCreationTimestamp="2026-03-09 09:34:51 +0000 UTC" firstStartedPulling="2026-03-09 09:34:54.102943508 +0000 UTC m=+897.662871318" lastFinishedPulling="2026-03-09 09:35:03.841086171 +0000 UTC m=+907.401013981" observedRunningTime="2026-03-09 09:35:09.406687135 +0000 UTC m=+912.966614945" watchObservedRunningTime="2026-03-09 09:35:09.411054401 +0000 UTC m=+912.970982211"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.140221 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-f85lg"]
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.141721 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.145362 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-qj55g"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.160403 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-f85lg"]
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.229495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sm54\" (UniqueName: \"kubernetes.io/projected/1d93efdc-b806-4f44-806b-9b6b43b80b22-kube-api-access-8sm54\") pod \"rabbitmq-cluster-operator-index-f85lg\" (UID: \"1d93efdc-b806-4f44-806b-9b6b43b80b22\") " pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.331061 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sm54\" (UniqueName: \"kubernetes.io/projected/1d93efdc-b806-4f44-806b-9b6b43b80b22-kube-api-access-8sm54\") pod \"rabbitmq-cluster-operator-index-f85lg\" (UID: \"1d93efdc-b806-4f44-806b-9b6b43b80b22\") " pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.352390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sm54\" (UniqueName: \"kubernetes.io/projected/1d93efdc-b806-4f44-806b-9b6b43b80b22-kube-api-access-8sm54\") pod \"rabbitmq-cluster-operator-index-f85lg\" (UID: \"1d93efdc-b806-4f44-806b-9b6b43b80b22\") " pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.368843 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"e4c9ed17-abec-40ab-acd0-aa857fd946f9","Type":"ContainerStarted","Data":"f4cdb6f1e1992214f1f628f9fba2ab0b2ba5c8b666ab0bee19ff93c0116f1a1d"}
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.369753 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.385588 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/memcached-0" podStartSLOduration=1.961634907 podStartE2EDuration="4.38556775s" podCreationTimestamp="2026-03-09 09:35:07 +0000 UTC" firstStartedPulling="2026-03-09 09:35:08.40664563 +0000 UTC m=+911.966573440" lastFinishedPulling="2026-03-09 09:35:10.830578473 +0000 UTC m=+914.390506283" observedRunningTime="2026-03-09 09:35:11.383534386 +0000 UTC m=+914.943462216" watchObservedRunningTime="2026-03-09 09:35:11.38556775 +0000 UTC m=+914.945495560"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.497996 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:11 crc kubenswrapper[4971]: I0309 09:35:11.995057 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-f85lg"]
Mar 09 09:35:12 crc kubenswrapper[4971]: W0309 09:35:12.009640 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d93efdc_b806_4f44_806b_9b6b43b80b22.slice/crio-102482abd7b09be789c8967d329755a5eb84617832c1a11c34c3a5dfb85b1297 WatchSource:0}: Error finding container 102482abd7b09be789c8967d329755a5eb84617832c1a11c34c3a5dfb85b1297: Status 404 returned error can't find the container with id 102482abd7b09be789c8967d329755a5eb84617832c1a11c34c3a5dfb85b1297
Mar 09 09:35:12 crc kubenswrapper[4971]: I0309 09:35:12.398058 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg" event={"ID":"1d93efdc-b806-4f44-806b-9b6b43b80b22","Type":"ContainerStarted","Data":"102482abd7b09be789c8967d329755a5eb84617832c1a11c34c3a5dfb85b1297"}
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.296734 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-2"
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.297077 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-2"
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.311400 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-0"
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.311449 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-0"
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.321614 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-1"
Mar 09 09:35:13 crc kubenswrapper[4971]: I0309 09:35:13.321656 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-1"
Mar 09 09:35:14 crc kubenswrapper[4971]: I0309 09:35:14.794515 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:35:14 crc kubenswrapper[4971]: I0309 09:35:14.794865 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:35:16 crc kubenswrapper[4971]: I0309 09:35:16.711893 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:35:16 crc kubenswrapper[4971]: I0309 09:35:16.755739 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p4n75"
Mar 09 09:35:17 crc kubenswrapper[4971]: I0309 09:35:17.446444 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg" event={"ID":"1d93efdc-b806-4f44-806b-9b6b43b80b22","Type":"ContainerStarted","Data":"657f98da4bb5c8ec052b2dcdc8dff1635f4fa17500dc4de37e7e07c18ba12bfb"}
Mar 09 09:35:17 crc kubenswrapper[4971]: I0309 09:35:17.463402 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg" podStartSLOduration=1.731521639 podStartE2EDuration="6.463382746s" podCreationTimestamp="2026-03-09 09:35:11 +0000 UTC" firstStartedPulling="2026-03-09 09:35:12.021442364 +0000 UTC m=+915.581370174" lastFinishedPulling="2026-03-09 09:35:16.753303471 +0000 UTC m=+920.313231281" observedRunningTime="2026-03-09 09:35:17.460742535 +0000 UTC m=+921.020670345" watchObservedRunningTime="2026-03-09 09:35:17.463382746 +0000 UTC m=+921.023310556"
Mar 09 09:35:17 crc kubenswrapper[4971]: I0309 09:35:17.802053 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-2"
Mar 09 09:35:17 crc kubenswrapper[4971]: I0309 09:35:17.874128 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-2"
Mar 09 09:35:17 crc kubenswrapper[4971]: I0309 09:35:17.916474 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/memcached-0"
Mar 09 09:35:21 crc kubenswrapper[4971]: I0309 09:35:21.498503 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:21 crc kubenswrapper[4971]: I0309 09:35:21.499078 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:21 crc kubenswrapper[4971]: I0309 09:35:21.536884 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:21 crc kubenswrapper[4971]: I0309 09:35:21.735245 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4n75"]
Mar 09 09:35:21 crc kubenswrapper[4971]: I0309 09:35:21.735868 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p4n75" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="registry-server" containerID="cri-o://d95c412ac78408149037104c692c549018b850d13202811d62603b7bf6993c87" gracePeriod=2
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.002524 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/root-account-create-update-kx6hb"]
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.003521 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.007726 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"openstack-mariadb-root-db-secret"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.008440 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-kx6hb"]
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.186830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.186898 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrb9\" (UniqueName: \"kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.288692 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.288771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrb9\" (UniqueName: \"kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.289901 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.314386 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrb9\" (UniqueName: \"kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9\") pod \"root-account-create-update-kx6hb\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.381155 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-kx6hb"
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.478808 4971 generic.go:334] "Generic (PLEG): container finished" podID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerID="d95c412ac78408149037104c692c549018b850d13202811d62603b7bf6993c87" exitCode=0
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.478899 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerDied","Data":"d95c412ac78408149037104c692c549018b850d13202811d62603b7bf6993c87"}
Mar 09 09:35:22 crc kubenswrapper[4971]: I0309 09:35:22.508247 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-f85lg"
Mar 09 09:35:23 crc kubenswrapper[4971]: I0309 09:35:23.367506 4971 prober.go:107] "Probe failed" probeType="Readiness" pod="swift-kuttl-tests/openstack-galera-2" podUID="0248bf28-3089-40e7-9ab1-2131010368c4" containerName="galera" probeResult="failure" output=<
Mar 09 09:35:23 crc kubenswrapper[4971]: wsrep_local_state_comment (Donor/Desynced) differs from Synced
Mar 09 09:35:23 crc kubenswrapper[4971]: >
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.377915 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"]
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.379438 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.382943 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.392388 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"]
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.527048 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.527125 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.527251 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5gt\" (UniqueName: \"kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.630337 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5gt\" (UniqueName: \"kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.630786 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.630822 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.631548 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"
Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.632040 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.666432 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5gt\" (UniqueName: \"kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.697892 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" Mar 09 09:35:24 crc kubenswrapper[4971]: I0309 09:35:24.982851 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-kx6hb"] Mar 09 09:35:24 crc kubenswrapper[4971]: W0309 09:35:24.992873 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15888f6_dc1e_4f8d_8f06_3cf15b21ad21.slice/crio-e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b WatchSource:0}: Error finding container e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b: Status 404 returned error can't find the container with id e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.260206 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p"] Mar 09 09:35:25 crc kubenswrapper[4971]: W0309 09:35:25.274607 4971 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a9f6bd_2366_4ffb_95a1_14d177e046a6.slice/crio-070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130 WatchSource:0}: Error finding container 070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130: Status 404 returned error can't find the container with id 070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130 Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.275108 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4n75" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.445390 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities\") pod \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.445572 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content\") pod \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.445619 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljfm\" (UniqueName: \"kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm\") pod \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\" (UID: \"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a\") " Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.446555 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities" (OuterVolumeSpecName: "utilities") pod 
"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" (UID: "4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.457411 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm" (OuterVolumeSpecName: "kube-api-access-nljfm") pod "4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" (UID: "4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a"). InnerVolumeSpecName "kube-api-access-nljfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.483591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" (UID: "4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.499890 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-kx6hb" event={"ID":"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21","Type":"ContainerStarted","Data":"9703dcd74fa7b81e233bb1f220d60ca4230be8db77e5a78031bb8d2c9692f4f5"} Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.499964 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-kx6hb" event={"ID":"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21","Type":"ContainerStarted","Data":"e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b"} Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.501785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerStarted","Data":"d53e2ca8d587b5a812aa3a1d5832077fe1fd6c6d53e3de096051a4a288672712"} Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.501833 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerStarted","Data":"070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130"} Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.505574 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4n75" event={"ID":"4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a","Type":"ContainerDied","Data":"b485b47de8345387d6cc7d7799058b47f4f1600d9cb0f9610519d5c7ea85e9ea"} Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.505643 4971 scope.go:117] "RemoveContainer" containerID="d95c412ac78408149037104c692c549018b850d13202811d62603b7bf6993c87" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.505639 4971 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4n75" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.523543 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/root-account-create-update-kx6hb" podStartSLOduration=4.523519397 podStartE2EDuration="4.523519397s" podCreationTimestamp="2026-03-09 09:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:35:25.520224159 +0000 UTC m=+929.080151969" watchObservedRunningTime="2026-03-09 09:35:25.523519397 +0000 UTC m=+929.083447207" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.530282 4971 scope.go:117] "RemoveContainer" containerID="1e0f697926ebbc93419b5e0b85d3c7c2a80b58a707c8fc2a9e36ce1382c34a0e" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.553411 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.553453 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.553469 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljfm\" (UniqueName: \"kubernetes.io/projected/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a-kube-api-access-nljfm\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.570975 4971 scope.go:117] "RemoveContainer" containerID="e6b100f102496b79d3f0aa6d4a19a718bd10f3d799f31cb1342fecf86484ce82" Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.671027 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-p4n75"] Mar 09 09:35:25 crc kubenswrapper[4971]: I0309 09:35:25.676044 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4n75"] Mar 09 09:35:26 crc kubenswrapper[4971]: I0309 09:35:26.512388 4971 generic.go:334] "Generic (PLEG): container finished" podID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerID="d53e2ca8d587b5a812aa3a1d5832077fe1fd6c6d53e3de096051a4a288672712" exitCode=0 Mar 09 09:35:26 crc kubenswrapper[4971]: I0309 09:35:26.512691 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerDied","Data":"d53e2ca8d587b5a812aa3a1d5832077fe1fd6c6d53e3de096051a4a288672712"} Mar 09 09:35:27 crc kubenswrapper[4971]: I0309 09:35:27.161103 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" path="/var/lib/kubelet/pods/4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a/volumes" Mar 09 09:35:27 crc kubenswrapper[4971]: I0309 09:35:27.521705 4971 generic.go:334] "Generic (PLEG): container finished" podID="f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" containerID="9703dcd74fa7b81e233bb1f220d60ca4230be8db77e5a78031bb8d2c9692f4f5" exitCode=0 Mar 09 09:35:27 crc kubenswrapper[4971]: I0309 09:35:27.521884 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-kx6hb" event={"ID":"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21","Type":"ContainerDied","Data":"9703dcd74fa7b81e233bb1f220d60ca4230be8db77e5a78031bb8d2c9692f4f5"} Mar 09 09:35:27 crc kubenswrapper[4971]: I0309 09:35:27.524610 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" 
event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerStarted","Data":"0de5a223d2e3cdd9f3ee39524f77d1bf9e81e3a209f6a8f37dc7f81abc7a47f1"} Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.530509 4971 generic.go:334] "Generic (PLEG): container finished" podID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerID="0de5a223d2e3cdd9f3ee39524f77d1bf9e81e3a209f6a8f37dc7f81abc7a47f1" exitCode=0 Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.531317 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerDied","Data":"0de5a223d2e3cdd9f3ee39524f77d1bf9e81e3a209f6a8f37dc7f81abc7a47f1"} Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.890211 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-kx6hb" Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.900694 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfrb9\" (UniqueName: \"kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9\") pod \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.900743 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts\") pod \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\" (UID: \"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21\") " Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.901574 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" 
(UID: "f15888f6-dc1e-4f8d-8f06-3cf15b21ad21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:35:28 crc kubenswrapper[4971]: I0309 09:35:28.914221 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9" (OuterVolumeSpecName: "kube-api-access-mfrb9") pod "f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" (UID: "f15888f6-dc1e-4f8d-8f06-3cf15b21ad21"). InnerVolumeSpecName "kube-api-access-mfrb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.002672 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfrb9\" (UniqueName: \"kubernetes.io/projected/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-kube-api-access-mfrb9\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.002717 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.537717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-kx6hb" event={"ID":"f15888f6-dc1e-4f8d-8f06-3cf15b21ad21","Type":"ContainerDied","Data":"e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b"} Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.537786 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01e5835f770ef701455d29727220c3e014007e9770fd81065bed1b39881f61b" Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.537908 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-kx6hb" Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.540115 4971 generic.go:334] "Generic (PLEG): container finished" podID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerID="e0ea029d783a4311e1777929ed8e877cf4b953f3dde561c6a1008d68a7d7c597" exitCode=0 Mar 09 09:35:29 crc kubenswrapper[4971]: I0309 09:35:29.540165 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerDied","Data":"e0ea029d783a4311e1777929ed8e877cf4b953f3dde561c6a1008d68a7d7c597"} Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142432 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:30 crc kubenswrapper[4971]: E0309 09:35:30.142747 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="extract-content" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142773 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="extract-content" Mar 09 09:35:30 crc kubenswrapper[4971]: E0309 09:35:30.142786 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="extract-utilities" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142794 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="extract-utilities" Mar 09 09:35:30 crc kubenswrapper[4971]: E0309 09:35:30.142802 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="registry-server" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142812 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="registry-server" Mar 09 09:35:30 crc kubenswrapper[4971]: E0309 09:35:30.142835 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" containerName="mariadb-account-create-update" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142843 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" containerName="mariadb-account-create-update" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.142984 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" containerName="mariadb-account-create-update" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.143007 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbf5a0d-5c57-4f0a-8c65-a7c67dd9464a" containerName="registry-server" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.144400 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.162416 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.317505 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.317562 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfwd\" (UniqueName: \"kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.317685 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.418793 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.418875 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.418903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfwd\" (UniqueName: \"kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.419482 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.419496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.438825 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfwd\" (UniqueName: \"kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd\") pod \"certified-operators-kdzx5\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.459234 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.754987 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:30 crc kubenswrapper[4971]: W0309 09:35:30.761511 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7322ca67_380d_4de9_bc40_b636ad1e57b7.slice/crio-49cecdf75475d31aeb0cf7f2ab4604588eb24c2f5bfb7cc03fe0c8b4b56ad55a WatchSource:0}: Error finding container 49cecdf75475d31aeb0cf7f2ab4604588eb24c2f5bfb7cc03fe0c8b4b56ad55a: Status 404 returned error can't find the container with id 49cecdf75475d31aeb0cf7f2ab4604588eb24c2f5bfb7cc03fe0c8b4b56ad55a Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.776923 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:35:30 crc kubenswrapper[4971]: I0309 09:35:30.891981 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-0" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.005301 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.130219 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util\") pod \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.130365 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle\") pod \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.130440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt5gt\" (UniqueName: \"kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt\") pod \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\" (UID: \"f7a9f6bd-2366-4ffb-95a1-14d177e046a6\") " Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.131141 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle" (OuterVolumeSpecName: "bundle") pod "f7a9f6bd-2366-4ffb-95a1-14d177e046a6" (UID: "f7a9f6bd-2366-4ffb-95a1-14d177e046a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.135927 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt" (OuterVolumeSpecName: "kube-api-access-qt5gt") pod "f7a9f6bd-2366-4ffb-95a1-14d177e046a6" (UID: "f7a9f6bd-2366-4ffb-95a1-14d177e046a6"). InnerVolumeSpecName "kube-api-access-qt5gt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.156503 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util" (OuterVolumeSpecName: "util") pod "f7a9f6bd-2366-4ffb-95a1-14d177e046a6" (UID: "f7a9f6bd-2366-4ffb-95a1-14d177e046a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.231704 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.231777 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt5gt\" (UniqueName: \"kubernetes.io/projected/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-kube-api-access-qt5gt\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.231793 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7a9f6bd-2366-4ffb-95a1-14d177e046a6-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.560316 4971 generic.go:334] "Generic (PLEG): container finished" podID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerID="253aa339abd7230f31d3b0fe2838573e32d1ed71f0de7d18a02a5eff1f556403" exitCode=0 Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.560387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerDied","Data":"253aa339abd7230f31d3b0fe2838573e32d1ed71f0de7d18a02a5eff1f556403"} Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.560442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" 
event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerStarted","Data":"49cecdf75475d31aeb0cf7f2ab4604588eb24c2f5bfb7cc03fe0c8b4b56ad55a"} Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.563894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" event={"ID":"f7a9f6bd-2366-4ffb-95a1-14d177e046a6","Type":"ContainerDied","Data":"070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130"} Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.563962 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="070361f52e9b481f58b7051f65e83c57b18ff07d09107d3282056709a7dc6130" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.563922 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.863334 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:35:31 crc kubenswrapper[4971]: I0309 09:35:31.950782 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-1" Mar 09 09:35:32 crc kubenswrapper[4971]: I0309 09:35:32.573876 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerStarted","Data":"612de3d985d18b7ab4b035aaa2a45fc299ae220418c2fdaf79f9f673fcb70e9e"} Mar 09 09:35:33 crc kubenswrapper[4971]: I0309 09:35:33.581833 4971 generic.go:334] "Generic (PLEG): container finished" podID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerID="612de3d985d18b7ab4b035aaa2a45fc299ae220418c2fdaf79f9f673fcb70e9e" exitCode=0 Mar 09 09:35:33 crc kubenswrapper[4971]: I0309 09:35:33.582018 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerDied","Data":"612de3d985d18b7ab4b035aaa2a45fc299ae220418c2fdaf79f9f673fcb70e9e"} Mar 09 09:35:33 crc kubenswrapper[4971]: I0309 09:35:33.582117 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerStarted","Data":"d6e84b01688a53ab222224507b669f0fef92055b20defc45ff74dbb1eb530cfc"} Mar 09 09:35:33 crc kubenswrapper[4971]: I0309 09:35:33.627201 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdzx5" podStartSLOduration=2.196112704 podStartE2EDuration="3.627177632s" podCreationTimestamp="2026-03-09 09:35:30 +0000 UTC" firstStartedPulling="2026-03-09 09:35:31.562486219 +0000 UTC m=+935.122414029" lastFinishedPulling="2026-03-09 09:35:32.993551147 +0000 UTC m=+936.553478957" observedRunningTime="2026-03-09 09:35:33.621395589 +0000 UTC m=+937.181323409" watchObservedRunningTime="2026-03-09 09:35:33.627177632 +0000 UTC m=+937.187105442" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.460048 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.461152 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.504580 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.659296 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 
09:35:40.870607 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw"] Mar 09 09:35:40 crc kubenswrapper[4971]: E0309 09:35:40.870899 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="pull" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.870913 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="pull" Mar 09 09:35:40 crc kubenswrapper[4971]: E0309 09:35:40.870928 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="extract" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.870937 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="extract" Mar 09 09:35:40 crc kubenswrapper[4971]: E0309 09:35:40.870948 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="util" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.870956 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="util" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.871105 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a9f6bd-2366-4ffb-95a1-14d177e046a6" containerName="extract" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.871617 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.873810 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-bthq4" Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.888489 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw"] Mar 09 09:35:40 crc kubenswrapper[4971]: I0309 09:35:40.921688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q4dg\" (UniqueName: \"kubernetes.io/projected/76fe4e94-ba21-4369-882b-efdc47c25ec3-kube-api-access-5q4dg\") pod \"rabbitmq-cluster-operator-779fc9694b-gvnmw\" (UID: \"76fe4e94-ba21-4369-882b-efdc47c25ec3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" Mar 09 09:35:41 crc kubenswrapper[4971]: I0309 09:35:41.022761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q4dg\" (UniqueName: \"kubernetes.io/projected/76fe4e94-ba21-4369-882b-efdc47c25ec3-kube-api-access-5q4dg\") pod \"rabbitmq-cluster-operator-779fc9694b-gvnmw\" (UID: \"76fe4e94-ba21-4369-882b-efdc47c25ec3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" Mar 09 09:35:41 crc kubenswrapper[4971]: I0309 09:35:41.043744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q4dg\" (UniqueName: \"kubernetes.io/projected/76fe4e94-ba21-4369-882b-efdc47c25ec3-kube-api-access-5q4dg\") pod \"rabbitmq-cluster-operator-779fc9694b-gvnmw\" (UID: \"76fe4e94-ba21-4369-882b-efdc47c25ec3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" Mar 09 09:35:41 crc kubenswrapper[4971]: I0309 09:35:41.190336 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" Mar 09 09:35:41 crc kubenswrapper[4971]: I0309 09:35:41.587114 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw"] Mar 09 09:35:41 crc kubenswrapper[4971]: I0309 09:35:41.628551 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" event={"ID":"76fe4e94-ba21-4369-882b-efdc47c25ec3","Type":"ContainerStarted","Data":"00cc8ff184ac8c9231f77e2670bfdeaffe445f3d8c005077994e042bf4f76c40"} Mar 09 09:35:43 crc kubenswrapper[4971]: I0309 09:35:43.932522 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:43 crc kubenswrapper[4971]: I0309 09:35:43.933058 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdzx5" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="registry-server" containerID="cri-o://d6e84b01688a53ab222224507b669f0fef92055b20defc45ff74dbb1eb530cfc" gracePeriod=2 Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.649472 4971 generic.go:334] "Generic (PLEG): container finished" podID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerID="d6e84b01688a53ab222224507b669f0fef92055b20defc45ff74dbb1eb530cfc" exitCode=0 Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.649503 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerDied","Data":"d6e84b01688a53ab222224507b669f0fef92055b20defc45ff74dbb1eb530cfc"} Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.794307 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.794385 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.794433 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.795055 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:35:44 crc kubenswrapper[4971]: I0309 09:35:44.795107 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f" gracePeriod=600 Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.192989 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.383058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content\") pod \"7322ca67-380d-4de9-bc40-b636ad1e57b7\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.383631 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbfwd\" (UniqueName: \"kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd\") pod \"7322ca67-380d-4de9-bc40-b636ad1e57b7\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.383666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities\") pod \"7322ca67-380d-4de9-bc40-b636ad1e57b7\" (UID: \"7322ca67-380d-4de9-bc40-b636ad1e57b7\") " Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.384685 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities" (OuterVolumeSpecName: "utilities") pod "7322ca67-380d-4de9-bc40-b636ad1e57b7" (UID: "7322ca67-380d-4de9-bc40-b636ad1e57b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.390068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd" (OuterVolumeSpecName: "kube-api-access-sbfwd") pod "7322ca67-380d-4de9-bc40-b636ad1e57b7" (UID: "7322ca67-380d-4de9-bc40-b636ad1e57b7"). InnerVolumeSpecName "kube-api-access-sbfwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.448846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7322ca67-380d-4de9-bc40-b636ad1e57b7" (UID: "7322ca67-380d-4de9-bc40-b636ad1e57b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.485386 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbfwd\" (UniqueName: \"kubernetes.io/projected/7322ca67-380d-4de9-bc40-b636ad1e57b7-kube-api-access-sbfwd\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.485462 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.485476 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7322ca67-380d-4de9-bc40-b636ad1e57b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.657239 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdzx5" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.657340 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdzx5" event={"ID":"7322ca67-380d-4de9-bc40-b636ad1e57b7","Type":"ContainerDied","Data":"49cecdf75475d31aeb0cf7f2ab4604588eb24c2f5bfb7cc03fe0c8b4b56ad55a"} Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.657409 4971 scope.go:117] "RemoveContainer" containerID="d6e84b01688a53ab222224507b669f0fef92055b20defc45ff74dbb1eb530cfc" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.660976 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f" exitCode=0 Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.661024 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f"} Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.661045 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1"} Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.663418 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" event={"ID":"76fe4e94-ba21-4369-882b-efdc47c25ec3","Type":"ContainerStarted","Data":"305228cb9c004495bb4cb657f76a9905f1734d0957cf0d072f072fa46a7eea3e"} Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.672185 4971 scope.go:117] "RemoveContainer" 
containerID="612de3d985d18b7ab4b035aaa2a45fc299ae220418c2fdaf79f9f673fcb70e9e" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.695067 4971 scope.go:117] "RemoveContainer" containerID="253aa339abd7230f31d3b0fe2838573e32d1ed71f0de7d18a02a5eff1f556403" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.695251 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.700008 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdzx5"] Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.714196 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-gvnmw" podStartSLOduration=2.280500205 podStartE2EDuration="5.714142419s" podCreationTimestamp="2026-03-09 09:35:40 +0000 UTC" firstStartedPulling="2026-03-09 09:35:41.59902916 +0000 UTC m=+945.158956970" lastFinishedPulling="2026-03-09 09:35:45.032671374 +0000 UTC m=+948.592599184" observedRunningTime="2026-03-09 09:35:45.706604559 +0000 UTC m=+949.266532369" watchObservedRunningTime="2026-03-09 09:35:45.714142419 +0000 UTC m=+949.274070239" Mar 09 09:35:45 crc kubenswrapper[4971]: I0309 09:35:45.715788 4971 scope.go:117] "RemoveContainer" containerID="75bb88e6db008edd2980d5e44e1931a66833b416d839996571ee8b190f030a3c" Mar 09 09:35:47 crc kubenswrapper[4971]: I0309 09:35:47.159127 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" path="/var/lib/kubelet/pods/7322ca67-380d-4de9-bc40-b636ad1e57b7/volumes" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.501693 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 09:35:50 crc kubenswrapper[4971]: E0309 09:35:50.502248 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="extract-content" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.502267 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="extract-content" Mar 09 09:35:50 crc kubenswrapper[4971]: E0309 09:35:50.502284 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="registry-server" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.502292 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="registry-server" Mar 09 09:35:50 crc kubenswrapper[4971]: E0309 09:35:50.502308 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="extract-utilities" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.502315 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="extract-utilities" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.502462 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7322ca67-380d-4de9-bc40-b636ad1e57b7" containerName="registry-server" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.503130 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.505098 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.505626 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-plugins-conf" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.506442 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-server-conf" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.506898 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-default-user" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.510854 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-server-dockercfg-7b5r5" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.524566 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.657879 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.657950 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.657984 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.658059 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.658088 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kqt5\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-kube-api-access-4kqt5\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.658120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.658151 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: 
I0309 09:35:50.658172 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759320 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kqt5\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-kube-api-access-4kqt5\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759385 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759426 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759454 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759491 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759518 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.759548 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.760374 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.760462 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.760996 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.763288 4971 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.763330 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/78ba8cb52c80b97088f5a1c2c5d076b0c147e56d839f0b624930657c0bef0d9d/globalmount\"" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.766904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.766928 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.770207 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.783641 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kqt5\" (UniqueName: \"kubernetes.io/projected/5fbe67b5-f371-4d9a-9777-cbfeff3f2863-kube-api-access-4kqt5\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.789240 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ee81f973-0269-4b19-9f3e-45addddc23d0\") pod \"rabbitmq-server-0\" (UID: \"5fbe67b5-f371-4d9a-9777-cbfeff3f2863\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:50 crc kubenswrapper[4971]: I0309 09:35:50.818703 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:35:51 crc kubenswrapper[4971]: I0309 09:35:51.365787 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 09 09:35:51 crc kubenswrapper[4971]: I0309 09:35:51.707595 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"5fbe67b5-f371-4d9a-9777-cbfeff3f2863","Type":"ContainerStarted","Data":"5b9093b8f10543a2d611024696f8177731ccf9dac108e88d459f7505742fa77e"} Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.548768 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-4m4x8"] Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.549936 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.553246 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-2p6xc" Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.566859 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4m4x8"] Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.688724 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqc4\" (UniqueName: \"kubernetes.io/projected/9be7483c-58ce-4857-b90f-fe74b32b3bdd-kube-api-access-4zqc4\") pod \"keystone-operator-index-4m4x8\" (UID: \"9be7483c-58ce-4857-b90f-fe74b32b3bdd\") " pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.790242 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqc4\" (UniqueName: \"kubernetes.io/projected/9be7483c-58ce-4857-b90f-fe74b32b3bdd-kube-api-access-4zqc4\") pod 
\"keystone-operator-index-4m4x8\" (UID: \"9be7483c-58ce-4857-b90f-fe74b32b3bdd\") " pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.810735 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqc4\" (UniqueName: \"kubernetes.io/projected/9be7483c-58ce-4857-b90f-fe74b32b3bdd-kube-api-access-4zqc4\") pod \"keystone-operator-index-4m4x8\" (UID: \"9be7483c-58ce-4857-b90f-fe74b32b3bdd\") " pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:35:52 crc kubenswrapper[4971]: I0309 09:35:52.872284 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:35:53 crc kubenswrapper[4971]: I0309 09:35:53.323125 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-4m4x8"] Mar 09 09:35:53 crc kubenswrapper[4971]: I0309 09:35:53.724736 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m4x8" event={"ID":"9be7483c-58ce-4857-b90f-fe74b32b3bdd","Type":"ContainerStarted","Data":"d7a767d0b6b611e713adbfd7647830404588b9173069e34003108dc7fc1bc2aa"} Mar 09 09:35:58 crc kubenswrapper[4971]: I0309 09:35:58.757240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-4m4x8" event={"ID":"9be7483c-58ce-4857-b90f-fe74b32b3bdd","Type":"ContainerStarted","Data":"536ad08b51425a071bc73366cd52c724905cf8cda91565e76178b3881ec57cbb"} Mar 09 09:35:58 crc kubenswrapper[4971]: I0309 09:35:58.760847 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"5fbe67b5-f371-4d9a-9777-cbfeff3f2863","Type":"ContainerStarted","Data":"dc9da9973797e7ebae0e820bdbe861db28c73aa9f5dd3ea8cb258beb202c037b"} Mar 09 09:35:58 crc kubenswrapper[4971]: I0309 09:35:58.779392 4971 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/keystone-operator-index-4m4x8" podStartSLOduration=2.357891288 podStartE2EDuration="6.7793285s" podCreationTimestamp="2026-03-09 09:35:52 +0000 UTC" firstStartedPulling="2026-03-09 09:35:53.345748682 +0000 UTC m=+956.905676492" lastFinishedPulling="2026-03-09 09:35:57.767185894 +0000 UTC m=+961.327113704" observedRunningTime="2026-03-09 09:35:58.771233055 +0000 UTC m=+962.331160875" watchObservedRunningTime="2026-03-09 09:35:58.7793285 +0000 UTC m=+962.339256310" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.127961 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550816-8d9h6"] Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.129070 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.131466 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.131671 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.131771 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.140219 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-8d9h6"] Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.308138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tngpr\" (UniqueName: \"kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr\") pod \"auto-csr-approver-29550816-8d9h6\" (UID: \"57b3d640-db55-4357-9b31-46a45640a583\") " 
pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.409466 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tngpr\" (UniqueName: \"kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr\") pod \"auto-csr-approver-29550816-8d9h6\" (UID: \"57b3d640-db55-4357-9b31-46a45640a583\") " pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.436256 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tngpr\" (UniqueName: \"kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr\") pod \"auto-csr-approver-29550816-8d9h6\" (UID: \"57b3d640-db55-4357-9b31-46a45640a583\") " pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.453221 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:00 crc kubenswrapper[4971]: I0309 09:36:00.895611 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-8d9h6"] Mar 09 09:36:01 crc kubenswrapper[4971]: I0309 09:36:01.780278 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" event={"ID":"57b3d640-db55-4357-9b31-46a45640a583","Type":"ContainerStarted","Data":"f68ee38a3b9c47d5b14f3747aeff89e4644448e46c4e452468dcada91f62ce58"} Mar 09 09:36:02 crc kubenswrapper[4971]: I0309 09:36:02.788694 4971 generic.go:334] "Generic (PLEG): container finished" podID="57b3d640-db55-4357-9b31-46a45640a583" containerID="f1721017276f6a271ee3c417166add857e6faf7b1a4707bafe463298269b1ab2" exitCode=0 Mar 09 09:36:02 crc kubenswrapper[4971]: I0309 09:36:02.788972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550816-8d9h6" event={"ID":"57b3d640-db55-4357-9b31-46a45640a583","Type":"ContainerDied","Data":"f1721017276f6a271ee3c417166add857e6faf7b1a4707bafe463298269b1ab2"} Mar 09 09:36:02 crc kubenswrapper[4971]: I0309 09:36:02.873056 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:36:02 crc kubenswrapper[4971]: I0309 09:36:02.873099 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:36:02 crc kubenswrapper[4971]: I0309 09:36:02.901752 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:36:03 crc kubenswrapper[4971]: I0309 09:36:03.847313 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-4m4x8" Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.115501 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.267052 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tngpr\" (UniqueName: \"kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr\") pod \"57b3d640-db55-4357-9b31-46a45640a583\" (UID: \"57b3d640-db55-4357-9b31-46a45640a583\") " Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.272210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr" (OuterVolumeSpecName: "kube-api-access-tngpr") pod "57b3d640-db55-4357-9b31-46a45640a583" (UID: "57b3d640-db55-4357-9b31-46a45640a583"). InnerVolumeSpecName "kube-api-access-tngpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.368309 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tngpr\" (UniqueName: \"kubernetes.io/projected/57b3d640-db55-4357-9b31-46a45640a583-kube-api-access-tngpr\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.802235 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" event={"ID":"57b3d640-db55-4357-9b31-46a45640a583","Type":"ContainerDied","Data":"f68ee38a3b9c47d5b14f3747aeff89e4644448e46c4e452468dcada91f62ce58"} Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.802257 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-8d9h6" Mar 09 09:36:04 crc kubenswrapper[4971]: I0309 09:36:04.802276 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f68ee38a3b9c47d5b14f3747aeff89e4644448e46c4e452468dcada91f62ce58" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.142228 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f6f5p"] Mar 09 09:36:05 crc kubenswrapper[4971]: E0309 09:36:05.142506 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b3d640-db55-4357-9b31-46a45640a583" containerName="oc" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.142524 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b3d640-db55-4357-9b31-46a45640a583" containerName="oc" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.142639 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b3d640-db55-4357-9b31-46a45640a583" containerName="oc" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.143527 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.172125 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6f5p"] Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.199833 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-ls2c9"] Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.204275 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-ls2c9"] Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.281540 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-catalog-content\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.281669 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-utilities\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.281702 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wwt\" (UniqueName: \"kubernetes.io/projected/299ba20e-3df0-4e8d-9f7b-8e2201422c98-kube-api-access-67wwt\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.383457 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-utilities\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.384592 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wwt\" (UniqueName: \"kubernetes.io/projected/299ba20e-3df0-4e8d-9f7b-8e2201422c98-kube-api-access-67wwt\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.384119 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-utilities\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.384714 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-catalog-content\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.385146 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/299ba20e-3df0-4e8d-9f7b-8e2201422c98-catalog-content\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.423632 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wwt\" (UniqueName: 
\"kubernetes.io/projected/299ba20e-3df0-4e8d-9f7b-8e2201422c98-kube-api-access-67wwt\") pod \"community-operators-f6f5p\" (UID: \"299ba20e-3df0-4e8d-9f7b-8e2201422c98\") " pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.459154 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:05 crc kubenswrapper[4971]: I0309 09:36:05.976454 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6f5p"] Mar 09 09:36:05 crc kubenswrapper[4971]: W0309 09:36:05.980279 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod299ba20e_3df0_4e8d_9f7b_8e2201422c98.slice/crio-7886bbdb587aeb75c3bef0d033f1a1979ff2d6376e8021e0b23949fa14c136bc WatchSource:0}: Error finding container 7886bbdb587aeb75c3bef0d033f1a1979ff2d6376e8021e0b23949fa14c136bc: Status 404 returned error can't find the container with id 7886bbdb587aeb75c3bef0d033f1a1979ff2d6376e8021e0b23949fa14c136bc Mar 09 09:36:06 crc kubenswrapper[4971]: I0309 09:36:06.815524 4971 generic.go:334] "Generic (PLEG): container finished" podID="299ba20e-3df0-4e8d-9f7b-8e2201422c98" containerID="ae4de3f2b6bee1a90beecee2467a52fa3967849f30bef4397c6098c738b5b640" exitCode=0 Mar 09 09:36:06 crc kubenswrapper[4971]: I0309 09:36:06.815572 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f5p" event={"ID":"299ba20e-3df0-4e8d-9f7b-8e2201422c98","Type":"ContainerDied","Data":"ae4de3f2b6bee1a90beecee2467a52fa3967849f30bef4397c6098c738b5b640"} Mar 09 09:36:06 crc kubenswrapper[4971]: I0309 09:36:06.815785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f5p" 
event={"ID":"299ba20e-3df0-4e8d-9f7b-8e2201422c98","Type":"ContainerStarted","Data":"7886bbdb587aeb75c3bef0d033f1a1979ff2d6376e8021e0b23949fa14c136bc"} Mar 09 09:36:07 crc kubenswrapper[4971]: I0309 09:36:07.158924 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a4e33a-3851-4e23-8f30-c766b7326dc0" path="/var/lib/kubelet/pods/b5a4e33a-3851-4e23-8f30-c766b7326dc0/volumes" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.596682 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586"] Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.598888 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.601006 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.609268 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586"] Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.763282 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.763425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld46l\" (UniqueName: \"kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l\") pod 
\"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.763479 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.864555 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.864936 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld46l\" (UniqueName: \"kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.864983 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " 
pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.865190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.865664 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.889786 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld46l\" (UniqueName: \"kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:10 crc kubenswrapper[4971]: I0309 09:36:10.921453 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.346614 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586"] Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.854421 4971 generic.go:334] "Generic (PLEG): container finished" podID="299ba20e-3df0-4e8d-9f7b-8e2201422c98" containerID="4efe5b3dbcf2236564dd0569a069f7facb6723f7a0ec5dc7dc5e52ef9a450c8e" exitCode=0 Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.855730 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f6f5p" event={"ID":"299ba20e-3df0-4e8d-9f7b-8e2201422c98","Type":"ContainerDied","Data":"4efe5b3dbcf2236564dd0569a069f7facb6723f7a0ec5dc7dc5e52ef9a450c8e"} Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.857680 4971 generic.go:334] "Generic (PLEG): container finished" podID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerID="539f5f3d25f0c5956ee772b3977e6529958fd98a0ab0849019e9e193e29515eb" exitCode=0 Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.857711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" event={"ID":"af0437bc-ace3-44dd-97d2-f23bee5b48f7","Type":"ContainerDied","Data":"539f5f3d25f0c5956ee772b3977e6529958fd98a0ab0849019e9e193e29515eb"} Mar 09 09:36:12 crc kubenswrapper[4971]: I0309 09:36:12.857726 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" event={"ID":"af0437bc-ace3-44dd-97d2-f23bee5b48f7","Type":"ContainerStarted","Data":"8d1b0b9e196f6a4b868938eca778d7a1384889810bb6f429f2135c8200f10d61"} Mar 09 09:36:13 crc kubenswrapper[4971]: I0309 09:36:13.865240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-f6f5p" event={"ID":"299ba20e-3df0-4e8d-9f7b-8e2201422c98","Type":"ContainerStarted","Data":"0317bf3523f7d691eefea12dcb37adac6b118b639b81ce2da73222e1e00f67ba"} Mar 09 09:36:13 crc kubenswrapper[4971]: I0309 09:36:13.893054 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f6f5p" podStartSLOduration=2.442553081 podStartE2EDuration="8.893027176s" podCreationTimestamp="2026-03-09 09:36:05 +0000 UTC" firstStartedPulling="2026-03-09 09:36:06.816861311 +0000 UTC m=+970.376789121" lastFinishedPulling="2026-03-09 09:36:13.267335406 +0000 UTC m=+976.827263216" observedRunningTime="2026-03-09 09:36:13.885991301 +0000 UTC m=+977.445919111" watchObservedRunningTime="2026-03-09 09:36:13.893027176 +0000 UTC m=+977.452954986" Mar 09 09:36:14 crc kubenswrapper[4971]: I0309 09:36:14.875327 4971 generic.go:334] "Generic (PLEG): container finished" podID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerID="473b518878c41e62813d983de131275d6959836bee720a54a947e590ce5043f2" exitCode=0 Mar 09 09:36:14 crc kubenswrapper[4971]: I0309 09:36:14.875396 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" event={"ID":"af0437bc-ace3-44dd-97d2-f23bee5b48f7","Type":"ContainerDied","Data":"473b518878c41e62813d983de131275d6959836bee720a54a947e590ce5043f2"} Mar 09 09:36:15 crc kubenswrapper[4971]: I0309 09:36:15.460575 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:15 crc kubenswrapper[4971]: I0309 09:36:15.461303 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:15 crc kubenswrapper[4971]: I0309 09:36:15.508546 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:15 crc kubenswrapper[4971]: I0309 09:36:15.887125 4971 generic.go:334] "Generic (PLEG): container finished" podID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerID="3e792976c5cdaa9357e3399a3c4e9f29179e2bc324207083845dbf7542d24c30" exitCode=0 Mar 09 09:36:15 crc kubenswrapper[4971]: I0309 09:36:15.887229 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" event={"ID":"af0437bc-ace3-44dd-97d2-f23bee5b48f7","Type":"ContainerDied","Data":"3e792976c5cdaa9357e3399a3c4e9f29179e2bc324207083845dbf7542d24c30"} Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.136476 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.260602 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle\") pod \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.260960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util\") pod \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.261085 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld46l\" (UniqueName: \"kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l\") pod \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\" (UID: \"af0437bc-ace3-44dd-97d2-f23bee5b48f7\") " Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 
09:36:17.261643 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle" (OuterVolumeSpecName: "bundle") pod "af0437bc-ace3-44dd-97d2-f23bee5b48f7" (UID: "af0437bc-ace3-44dd-97d2-f23bee5b48f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.267011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l" (OuterVolumeSpecName: "kube-api-access-ld46l") pod "af0437bc-ace3-44dd-97d2-f23bee5b48f7" (UID: "af0437bc-ace3-44dd-97d2-f23bee5b48f7"). InnerVolumeSpecName "kube-api-access-ld46l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.276308 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util" (OuterVolumeSpecName: "util") pod "af0437bc-ace3-44dd-97d2-f23bee5b48f7" (UID: "af0437bc-ace3-44dd-97d2-f23bee5b48f7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.362733 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.362772 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af0437bc-ace3-44dd-97d2-f23bee5b48f7-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.362782 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld46l\" (UniqueName: \"kubernetes.io/projected/af0437bc-ace3-44dd-97d2-f23bee5b48f7-kube-api-access-ld46l\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.904313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" event={"ID":"af0437bc-ace3-44dd-97d2-f23bee5b48f7","Type":"ContainerDied","Data":"8d1b0b9e196f6a4b868938eca778d7a1384889810bb6f429f2135c8200f10d61"} Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.904379 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d1b0b9e196f6a4b868938eca778d7a1384889810bb6f429f2135c8200f10d61" Mar 09 09:36:17 crc kubenswrapper[4971]: I0309 09:36:17.904445 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586" Mar 09 09:36:25 crc kubenswrapper[4971]: I0309 09:36:25.497490 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f6f5p" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.961379 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6659f69886-7494k"] Mar 09 09:36:26 crc kubenswrapper[4971]: E0309 09:36:26.961707 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="extract" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.961725 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="extract" Mar 09 09:36:26 crc kubenswrapper[4971]: E0309 09:36:26.961743 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="util" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.961751 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="util" Mar 09 09:36:26 crc kubenswrapper[4971]: E0309 09:36:26.961777 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="pull" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.961786 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="pull" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.961898 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="af0437bc-ace3-44dd-97d2-f23bee5b48f7" containerName="extract" Mar 09 09:36:26 crc kubenswrapper[4971]: I0309 09:36:26.962339 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:26 crc kubenswrapper[4971]: W0309 09:36:26.967591 4971 reflector.go:561] object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4vjmm": failed to list *v1.Secret: secrets "keystone-operator-controller-manager-dockercfg-4vjmm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 09 09:36:26 crc kubenswrapper[4971]: E0309 09:36:26.967630 4971 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"keystone-operator-controller-manager-dockercfg-4vjmm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-operator-controller-manager-dockercfg-4vjmm\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 09:36:26 crc kubenswrapper[4971]: W0309 09:36:26.967667 4971 reflector.go:561] object-"openstack-operators"/"keystone-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "keystone-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 09 09:36:26 crc kubenswrapper[4971]: E0309 09:36:26.967677 4971 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"keystone-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"keystone-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' 
and this object" logger="UnhandledError" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.024470 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6659f69886-7494k"] Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.046062 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f6f5p"] Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.118002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vnv\" (UniqueName: \"kubernetes.io/projected/2fead548-d73c-4b70-8a1f-84aedf664c53-kube-api-access-95vnv\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.118109 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.118194 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.219882 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.220230 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vnv\" (UniqueName: \"kubernetes.io/projected/2fead548-d73c-4b70-8a1f-84aedf664c53-kube-api-access-95vnv\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.220458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.243400 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vnv\" (UniqueName: \"kubernetes.io/projected/2fead548-d73c-4b70-8a1f-84aedf664c53-kube-api-access-95vnv\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.531761 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.532057 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9v8xz" 
podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="registry-server" containerID="cri-o://5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7" gracePeriod=2 Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.922364 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.964398 4971 generic.go:334] "Generic (PLEG): container finished" podID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerID="5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7" exitCode=0 Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.964459 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerDied","Data":"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7"} Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.964505 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9v8xz" event={"ID":"02ebe249-212b-44fd-87f9-3c8db2c3b826","Type":"ContainerDied","Data":"23aeccd20330515235e1c074078bb22a05148521812684504fd473de10d5131a"} Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.964525 4971 scope.go:117] "RemoveContainer" containerID="5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.964683 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9v8xz" Mar 09 09:36:27 crc kubenswrapper[4971]: I0309 09:36:27.988927 4971 scope.go:117] "RemoveContainer" containerID="9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.004134 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4vjmm" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.008943 4971 scope.go:117] "RemoveContainer" containerID="4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.030886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities\") pod \"02ebe249-212b-44fd-87f9-3c8db2c3b826\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.031015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content\") pod \"02ebe249-212b-44fd-87f9-3c8db2c3b826\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.031083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx45n\" (UniqueName: \"kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n\") pod \"02ebe249-212b-44fd-87f9-3c8db2c3b826\" (UID: \"02ebe249-212b-44fd-87f9-3c8db2c3b826\") " Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.036535 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n" (OuterVolumeSpecName: "kube-api-access-nx45n") pod 
"02ebe249-212b-44fd-87f9-3c8db2c3b826" (UID: "02ebe249-212b-44fd-87f9-3c8db2c3b826"). InnerVolumeSpecName "kube-api-access-nx45n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.036591 4971 scope.go:117] "RemoveContainer" containerID="5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.036833 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities" (OuterVolumeSpecName: "utilities") pod "02ebe249-212b-44fd-87f9-3c8db2c3b826" (UID: "02ebe249-212b-44fd-87f9-3c8db2c3b826"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.037104 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7\": container with ID starting with 5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7 not found: ID does not exist" containerID="5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.037184 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7"} err="failed to get container status \"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7\": rpc error: code = NotFound desc = could not find container \"5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7\": container with ID starting with 5cc7555bec3e72637dec29ad212fbd7ba71248794cf263c2f7c103f3e10fd5d7 not found: ID does not exist" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.037214 4971 scope.go:117] "RemoveContainer" 
containerID="9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69" Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.038330 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69\": container with ID starting with 9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69 not found: ID does not exist" containerID="9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.038834 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69"} err="failed to get container status \"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69\": rpc error: code = NotFound desc = could not find container \"9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69\": container with ID starting with 9b67b694ddeb0d3bd22c53a8bd844238f81ad1e69f46fa235960fe8e373a5a69 not found: ID does not exist" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.038861 4971 scope.go:117] "RemoveContainer" containerID="4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5" Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.040529 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5\": container with ID starting with 4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5 not found: ID does not exist" containerID="4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.040566 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5"} err="failed to get container status \"4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5\": rpc error: code = NotFound desc = could not find container \"4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5\": container with ID starting with 4a3f7bbdc784658174581915946ffe133ca608dc459390c6d7dcf9cb5c84b7e5 not found: ID does not exist" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.101884 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02ebe249-212b-44fd-87f9-3c8db2c3b826" (UID: "02ebe249-212b-44fd-87f9-3c8db2c3b826"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.132693 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.132980 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebe249-212b-44fd-87f9-3c8db2c3b826-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.133056 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx45n\" (UniqueName: \"kubernetes.io/projected/02ebe249-212b-44fd-87f9-3c8db2c3b826-kube-api-access-nx45n\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.221253 4971 secret.go:188] Couldn't get secret openstack-operators/keystone-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:36:28 crc kubenswrapper[4971]: 
E0309 09:36:28.221285 4971 secret.go:188] Couldn't get secret openstack-operators/keystone-operator-controller-manager-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.221361 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert podName:2fead548-d73c-4b70-8a1f-84aedf664c53 nodeName:}" failed. No retries permitted until 2026-03-09 09:36:28.721326213 +0000 UTC m=+992.281254023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert") pod "keystone-operator-controller-manager-6659f69886-7494k" (UID: "2fead548-d73c-4b70-8a1f-84aedf664c53") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:36:28 crc kubenswrapper[4971]: E0309 09:36:28.221398 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert podName:2fead548-d73c-4b70-8a1f-84aedf664c53 nodeName:}" failed. No retries permitted until 2026-03-09 09:36:28.721371884 +0000 UTC m=+992.281299694 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert") pod "keystone-operator-controller-manager-6659f69886-7494k" (UID: "2fead548-d73c-4b70-8a1f-84aedf664c53") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.257837 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.299014 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.304867 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9v8xz"] Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.742708 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.742822 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.748834 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-apiservice-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.749960 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2fead548-d73c-4b70-8a1f-84aedf664c53-webhook-cert\") pod \"keystone-operator-controller-manager-6659f69886-7494k\" (UID: \"2fead548-d73c-4b70-8a1f-84aedf664c53\") " pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:28 crc kubenswrapper[4971]: I0309 09:36:28.790449 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:29 crc kubenswrapper[4971]: I0309 09:36:29.164300 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" path="/var/lib/kubelet/pods/02ebe249-212b-44fd-87f9-3c8db2c3b826/volumes" Mar 09 09:36:29 crc kubenswrapper[4971]: I0309 09:36:29.302050 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6659f69886-7494k"] Mar 09 09:36:29 crc kubenswrapper[4971]: W0309 09:36:29.310099 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fead548_d73c_4b70_8a1f_84aedf664c53.slice/crio-6c644a3b3ba5713e2633e1ce0255cd7fc464c20bac7046929323394fbdfd0013 WatchSource:0}: Error finding container 6c644a3b3ba5713e2633e1ce0255cd7fc464c20bac7046929323394fbdfd0013: Status 404 returned error can't find the container with id 6c644a3b3ba5713e2633e1ce0255cd7fc464c20bac7046929323394fbdfd0013 Mar 09 09:36:29 crc kubenswrapper[4971]: I0309 09:36:29.982374 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" event={"ID":"2fead548-d73c-4b70-8a1f-84aedf664c53","Type":"ContainerStarted","Data":"6c644a3b3ba5713e2633e1ce0255cd7fc464c20bac7046929323394fbdfd0013"} Mar 09 09:36:30 crc kubenswrapper[4971]: I0309 09:36:30.990536 4971 generic.go:334] "Generic (PLEG): container finished" podID="5fbe67b5-f371-4d9a-9777-cbfeff3f2863" containerID="dc9da9973797e7ebae0e820bdbe861db28c73aa9f5dd3ea8cb258beb202c037b" exitCode=0 Mar 09 09:36:30 crc kubenswrapper[4971]: I0309 09:36:30.990638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"5fbe67b5-f371-4d9a-9777-cbfeff3f2863","Type":"ContainerDied","Data":"dc9da9973797e7ebae0e820bdbe861db28c73aa9f5dd3ea8cb258beb202c037b"} Mar 09 09:36:32 crc kubenswrapper[4971]: I0309 09:36:32.011400 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"5fbe67b5-f371-4d9a-9777-cbfeff3f2863","Type":"ContainerStarted","Data":"66b4b95af5987cb6950ae9fae3edef5d769de543434394434c36a0b9849fb7cb"} Mar 09 09:36:32 crc kubenswrapper[4971]: I0309 09:36:32.011989 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:36:32 crc kubenswrapper[4971]: I0309 09:36:32.045207 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.701943549 podStartE2EDuration="43.045185801s" podCreationTimestamp="2026-03-09 09:35:49 +0000 UTC" firstStartedPulling="2026-03-09 09:35:51.372131786 +0000 UTC m=+954.932059596" lastFinishedPulling="2026-03-09 09:35:57.715374038 +0000 UTC m=+961.275301848" observedRunningTime="2026-03-09 09:36:32.043316956 +0000 UTC m=+995.603244766" watchObservedRunningTime="2026-03-09 09:36:32.045185801 +0000 UTC m=+995.605113611" Mar 09 09:36:34 crc kubenswrapper[4971]: I0309 09:36:34.026156 
4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" event={"ID":"2fead548-d73c-4b70-8a1f-84aedf664c53","Type":"ContainerStarted","Data":"58a74eb5056cb75c31d8db9a3c35b0b6642db0214bcd9f7f85ffd5d3449df8a1"} Mar 09 09:36:34 crc kubenswrapper[4971]: I0309 09:36:34.026537 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:38 crc kubenswrapper[4971]: I0309 09:36:38.796418 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" Mar 09 09:36:38 crc kubenswrapper[4971]: I0309 09:36:38.812871 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6659f69886-7494k" podStartSLOduration=8.394689925 podStartE2EDuration="12.812850648s" podCreationTimestamp="2026-03-09 09:36:26 +0000 UTC" firstStartedPulling="2026-03-09 09:36:29.3126331 +0000 UTC m=+992.872560910" lastFinishedPulling="2026-03-09 09:36:33.730793813 +0000 UTC m=+997.290721633" observedRunningTime="2026-03-09 09:36:34.047711602 +0000 UTC m=+997.607639412" watchObservedRunningTime="2026-03-09 09:36:38.812850648 +0000 UTC m=+1002.372778458" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.180227 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-create-qt784"] Mar 09 09:36:41 crc kubenswrapper[4971]: E0309 09:36:41.180771 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="extract-content" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.180786 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="extract-content" Mar 09 09:36:41 crc kubenswrapper[4971]: E0309 09:36:41.180800 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="registry-server" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.180805 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="registry-server" Mar 09 09:36:41 crc kubenswrapper[4971]: E0309 09:36:41.180822 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="extract-utilities" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.180828 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="extract-utilities" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.180946 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ebe249-212b-44fd-87f9-3c8db2c3b826" containerName="registry-server" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.181388 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.194366 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qt784"] Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.229766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88p4f\" (UniqueName: \"kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.229806 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.274136 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd"] Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.275377 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.278883 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-db-secret" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.285793 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd"] Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.331288 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts\") pod \"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.331338 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88p4f\" (UniqueName: \"kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.331379 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.331428 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scp7x\" (UniqueName: \"kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x\") pod 
\"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.332331 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.364493 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88p4f\" (UniqueName: \"kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f\") pod \"keystone-db-create-qt784\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.433289 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts\") pod \"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.433393 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scp7x\" (UniqueName: \"kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x\") pod \"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.437041 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts\") pod \"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.459924 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scp7x\" (UniqueName: \"kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x\") pod \"keystone-1f04-account-create-update-4wtvd\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.500472 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:41 crc kubenswrapper[4971]: I0309 09:36:41.589278 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:42 crc kubenswrapper[4971]: I0309 09:36:42.404846 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd"] Mar 09 09:36:42 crc kubenswrapper[4971]: I0309 09:36:42.538070 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qt784"] Mar 09 09:36:42 crc kubenswrapper[4971]: W0309 09:36:42.539742 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ccc8050_0beb_48dc_9422_04484a337b7e.slice/crio-c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9 WatchSource:0}: Error finding container c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9: Status 404 returned error can't find the container with id c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9 Mar 09 09:36:42 crc 
kubenswrapper[4971]: I0309 09:36:42.946274 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:42 crc kubenswrapper[4971]: I0309 09:36:42.947470 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:42 crc kubenswrapper[4971]: I0309 09:36:42.949161 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-fw57w" Mar 09 09:36:42 crc kubenswrapper[4971]: I0309 09:36:42.951952 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.064051 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmvz\" (UniqueName: \"kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz\") pod \"barbican-operator-index-kv892\" (UID: \"667399eb-3c03-434e-b2c2-008b7a882a3f\") " pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.120677 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" event={"ID":"1869c051-5ff7-4504-92c2-cbf07998153d","Type":"ContainerStarted","Data":"e96aba92dae40cbde8de408946678f639a7c5a3a10f8b14c96b8d614e73390a0"} Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.121998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qt784" event={"ID":"7ccc8050-0beb-48dc-9422-04484a337b7e","Type":"ContainerStarted","Data":"c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9"} Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.166445 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmvz\" (UniqueName: 
\"kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz\") pod \"barbican-operator-index-kv892\" (UID: \"667399eb-3c03-434e-b2c2-008b7a882a3f\") " pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.205170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmvz\" (UniqueName: \"kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz\") pod \"barbican-operator-index-kv892\" (UID: \"667399eb-3c03-434e-b2c2-008b7a882a3f\") " pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.267886 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:43 crc kubenswrapper[4971]: I0309 09:36:43.720999 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:43 crc kubenswrapper[4971]: W0309 09:36:43.737602 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667399eb_3c03_434e_b2c2_008b7a882a3f.slice/crio-545a1e7586dd15e4059f69b0f6506b3048ba739f89aae797ea54b076a7670b6b WatchSource:0}: Error finding container 545a1e7586dd15e4059f69b0f6506b3048ba739f89aae797ea54b076a7670b6b: Status 404 returned error can't find the container with id 545a1e7586dd15e4059f69b0f6506b3048ba739f89aae797ea54b076a7670b6b Mar 09 09:36:44 crc kubenswrapper[4971]: I0309 09:36:44.129116 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-kv892" event={"ID":"667399eb-3c03-434e-b2c2-008b7a882a3f","Type":"ContainerStarted","Data":"545a1e7586dd15e4059f69b0f6506b3048ba739f89aae797ea54b076a7670b6b"} Mar 09 09:36:44 crc kubenswrapper[4971]: I0309 09:36:44.130512 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" event={"ID":"1869c051-5ff7-4504-92c2-cbf07998153d","Type":"ContainerStarted","Data":"c34a56e58e19b4a4e4f7a4d883f80da11086d79f2d74adf9c01878e889d48f17"} Mar 09 09:36:44 crc kubenswrapper[4971]: I0309 09:36:44.132720 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qt784" event={"ID":"7ccc8050-0beb-48dc-9422-04484a337b7e","Type":"ContainerStarted","Data":"ba076d7843d817330fd2fde06e7d4350f89f50e00698db6cf8ed6476608f7a0d"} Mar 09 09:36:44 crc kubenswrapper[4971]: I0309 09:36:44.147246 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" podStartSLOduration=3.147226399 podStartE2EDuration="3.147226399s" podCreationTimestamp="2026-03-09 09:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:36:44.142756239 +0000 UTC m=+1007.702684059" watchObservedRunningTime="2026-03-09 09:36:44.147226399 +0000 UTC m=+1007.707154209" Mar 09 09:36:44 crc kubenswrapper[4971]: I0309 09:36:44.162133 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-create-qt784" podStartSLOduration=3.162118143 podStartE2EDuration="3.162118143s" podCreationTimestamp="2026-03-09 09:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:36:44.161446953 +0000 UTC m=+1007.721374763" watchObservedRunningTime="2026-03-09 09:36:44.162118143 +0000 UTC m=+1007.722045953" Mar 09 09:36:45 crc kubenswrapper[4971]: I0309 09:36:45.140064 4971 generic.go:334] "Generic (PLEG): container finished" podID="7ccc8050-0beb-48dc-9422-04484a337b7e" containerID="ba076d7843d817330fd2fde06e7d4350f89f50e00698db6cf8ed6476608f7a0d" exitCode=0 Mar 09 09:36:45 crc kubenswrapper[4971]: 
I0309 09:36:45.140105 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qt784" event={"ID":"7ccc8050-0beb-48dc-9422-04484a337b7e","Type":"ContainerDied","Data":"ba076d7843d817330fd2fde06e7d4350f89f50e00698db6cf8ed6476608f7a0d"} Mar 09 09:36:45 crc kubenswrapper[4971]: I0309 09:36:45.143800 4971 generic.go:334] "Generic (PLEG): container finished" podID="1869c051-5ff7-4504-92c2-cbf07998153d" containerID="c34a56e58e19b4a4e4f7a4d883f80da11086d79f2d74adf9c01878e889d48f17" exitCode=0 Mar 09 09:36:45 crc kubenswrapper[4971]: I0309 09:36:45.143872 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" event={"ID":"1869c051-5ff7-4504-92c2-cbf07998153d","Type":"ContainerDied","Data":"c34a56e58e19b4a4e4f7a4d883f80da11086d79f2d74adf9c01878e889d48f17"} Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.152449 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-kv892" event={"ID":"667399eb-3c03-434e-b2c2-008b7a882a3f","Type":"ContainerStarted","Data":"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7"} Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.168592 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-kv892" podStartSLOduration=2.328923699 podStartE2EDuration="4.168569899s" podCreationTimestamp="2026-03-09 09:36:42 +0000 UTC" firstStartedPulling="2026-03-09 09:36:43.739197277 +0000 UTC m=+1007.299125097" lastFinishedPulling="2026-03-09 09:36:45.578843487 +0000 UTC m=+1009.138771297" observedRunningTime="2026-03-09 09:36:46.165724206 +0000 UTC m=+1009.725652026" watchObservedRunningTime="2026-03-09 09:36:46.168569899 +0000 UTC m=+1009.728497709" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.478846 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.484741 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.625205 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts\") pod \"1869c051-5ff7-4504-92c2-cbf07998153d\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.625397 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts\") pod \"7ccc8050-0beb-48dc-9422-04484a337b7e\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.625449 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88p4f\" (UniqueName: \"kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f\") pod \"7ccc8050-0beb-48dc-9422-04484a337b7e\" (UID: \"7ccc8050-0beb-48dc-9422-04484a337b7e\") " Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.625485 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scp7x\" (UniqueName: \"kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x\") pod \"1869c051-5ff7-4504-92c2-cbf07998153d\" (UID: \"1869c051-5ff7-4504-92c2-cbf07998153d\") " Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.626161 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "7ccc8050-0beb-48dc-9422-04484a337b7e" (UID: "7ccc8050-0beb-48dc-9422-04484a337b7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.626322 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1869c051-5ff7-4504-92c2-cbf07998153d" (UID: "1869c051-5ff7-4504-92c2-cbf07998153d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.631223 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x" (OuterVolumeSpecName: "kube-api-access-scp7x") pod "1869c051-5ff7-4504-92c2-cbf07998153d" (UID: "1869c051-5ff7-4504-92c2-cbf07998153d"). InnerVolumeSpecName "kube-api-access-scp7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.631316 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f" (OuterVolumeSpecName: "kube-api-access-88p4f") pod "7ccc8050-0beb-48dc-9422-04484a337b7e" (UID: "7ccc8050-0beb-48dc-9422-04484a337b7e"). InnerVolumeSpecName "kube-api-access-88p4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.726797 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccc8050-0beb-48dc-9422-04484a337b7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.726838 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88p4f\" (UniqueName: \"kubernetes.io/projected/7ccc8050-0beb-48dc-9422-04484a337b7e-kube-api-access-88p4f\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.726850 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scp7x\" (UniqueName: \"kubernetes.io/projected/1869c051-5ff7-4504-92c2-cbf07998153d-kube-api-access-scp7x\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:46 crc kubenswrapper[4971]: I0309 09:36:46.726859 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1869c051-5ff7-4504-92c2-cbf07998153d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.139227 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.161369 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.164437 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qt784" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.164758 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd" event={"ID":"1869c051-5ff7-4504-92c2-cbf07998153d","Type":"ContainerDied","Data":"e96aba92dae40cbde8de408946678f639a7c5a3a10f8b14c96b8d614e73390a0"} Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.164805 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96aba92dae40cbde8de408946678f639a7c5a3a10f8b14c96b8d614e73390a0" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.164816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qt784" event={"ID":"7ccc8050-0beb-48dc-9422-04484a337b7e","Type":"ContainerDied","Data":"c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9"} Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.164826 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c762ccdc2e87aaca2cce2661f47cc1dc9baae3eb76dac7f89329a5a00c1291d9" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.739126 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-p2vwl"] Mar 09 09:36:47 crc kubenswrapper[4971]: E0309 09:36:47.739670 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ccc8050-0beb-48dc-9422-04484a337b7e" containerName="mariadb-database-create" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.739682 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ccc8050-0beb-48dc-9422-04484a337b7e" containerName="mariadb-database-create" Mar 09 09:36:47 crc kubenswrapper[4971]: E0309 09:36:47.739707 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1869c051-5ff7-4504-92c2-cbf07998153d" containerName="mariadb-account-create-update" Mar 09 09:36:47 crc 
kubenswrapper[4971]: I0309 09:36:47.739714 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1869c051-5ff7-4504-92c2-cbf07998153d" containerName="mariadb-account-create-update" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.739822 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ccc8050-0beb-48dc-9422-04484a337b7e" containerName="mariadb-database-create" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.739839 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1869c051-5ff7-4504-92c2-cbf07998153d" containerName="mariadb-account-create-update" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.740257 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.751110 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-p2vwl"] Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.847977 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdxg\" (UniqueName: \"kubernetes.io/projected/02129c03-c7b1-4165-b737-019e757c635d-kube-api-access-fqdxg\") pod \"barbican-operator-index-p2vwl\" (UID: \"02129c03-c7b1-4165-b737-019e757c635d\") " pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.949142 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdxg\" (UniqueName: \"kubernetes.io/projected/02129c03-c7b1-4165-b737-019e757c635d-kube-api-access-fqdxg\") pod \"barbican-operator-index-p2vwl\" (UID: \"02129c03-c7b1-4165-b737-019e757c635d\") " pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:47 crc kubenswrapper[4971]: I0309 09:36:47.971489 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdxg\" 
(UniqueName: \"kubernetes.io/projected/02129c03-c7b1-4165-b737-019e757c635d-kube-api-access-fqdxg\") pod \"barbican-operator-index-p2vwl\" (UID: \"02129c03-c7b1-4165-b737-019e757c635d\") " pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.086330 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.175461 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-kv892" podUID="667399eb-3c03-434e-b2c2-008b7a882a3f" containerName="registry-server" containerID="cri-o://a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7" gracePeriod=2 Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.481778 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-p2vwl"] Mar 09 09:36:48 crc kubenswrapper[4971]: W0309 09:36:48.488986 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02129c03_c7b1_4165_b737_019e757c635d.slice/crio-213a0ed111b035255d44b829a16502bd39c67b3db0f02a5dc4fdfeb491da44cb WatchSource:0}: Error finding container 213a0ed111b035255d44b829a16502bd39c67b3db0f02a5dc4fdfeb491da44cb: Status 404 returned error can't find the container with id 213a0ed111b035255d44b829a16502bd39c67b3db0f02a5dc4fdfeb491da44cb Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.555371 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.658182 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcmvz\" (UniqueName: \"kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz\") pod \"667399eb-3c03-434e-b2c2-008b7a882a3f\" (UID: \"667399eb-3c03-434e-b2c2-008b7a882a3f\") " Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.665903 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz" (OuterVolumeSpecName: "kube-api-access-xcmvz") pod "667399eb-3c03-434e-b2c2-008b7a882a3f" (UID: "667399eb-3c03-434e-b2c2-008b7a882a3f"). InnerVolumeSpecName "kube-api-access-xcmvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:48 crc kubenswrapper[4971]: I0309 09:36:48.760480 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcmvz\" (UniqueName: \"kubernetes.io/projected/667399eb-3c03-434e-b2c2-008b7a882a3f-kube-api-access-xcmvz\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.183230 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-p2vwl" event={"ID":"02129c03-c7b1-4165-b737-019e757c635d","Type":"ContainerStarted","Data":"264d5a9e5f5a9a128efaf840075634b91eeb9e5210770c24fd36c16162fcb4ad"} Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.183296 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-p2vwl" event={"ID":"02129c03-c7b1-4165-b737-019e757c635d","Type":"ContainerStarted","Data":"213a0ed111b035255d44b829a16502bd39c67b3db0f02a5dc4fdfeb491da44cb"} Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.185881 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="667399eb-3c03-434e-b2c2-008b7a882a3f" containerID="a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7" exitCode=0 Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.185930 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-kv892" event={"ID":"667399eb-3c03-434e-b2c2-008b7a882a3f","Type":"ContainerDied","Data":"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7"} Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.185957 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-kv892" event={"ID":"667399eb-3c03-434e-b2c2-008b7a882a3f","Type":"ContainerDied","Data":"545a1e7586dd15e4059f69b0f6506b3048ba739f89aae797ea54b076a7670b6b"} Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.185980 4971 scope.go:117] "RemoveContainer" containerID="a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7" Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.186105 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-kv892" Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.211576 4971 scope.go:117] "RemoveContainer" containerID="a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7" Mar 09 09:36:49 crc kubenswrapper[4971]: E0309 09:36:49.211958 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7\": container with ID starting with a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7 not found: ID does not exist" containerID="a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7" Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.211993 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7"} err="failed to get container status \"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7\": rpc error: code = NotFound desc = could not find container \"a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7\": container with ID starting with a5370ed454d93138f467c69b249f02333c6cf4e34a2491b9fed028bcc31a69c7 not found: ID does not exist" Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.214198 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-p2vwl" podStartSLOduration=1.7473614309999999 podStartE2EDuration="2.214177684s" podCreationTimestamp="2026-03-09 09:36:47 +0000 UTC" firstStartedPulling="2026-03-09 09:36:48.494821237 +0000 UTC m=+1012.054749047" lastFinishedPulling="2026-03-09 09:36:48.96163749 +0000 UTC m=+1012.521565300" observedRunningTime="2026-03-09 09:36:49.201384861 +0000 UTC m=+1012.761312681" watchObservedRunningTime="2026-03-09 09:36:49.214177684 +0000 UTC m=+1012.774105494" Mar 09 09:36:49 crc 
kubenswrapper[4971]: I0309 09:36:49.221472 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:49 crc kubenswrapper[4971]: I0309 09:36:49.226508 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-kv892"] Mar 09 09:36:50 crc kubenswrapper[4971]: I0309 09:36:50.821762 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.159570 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667399eb-3c03-434e-b2c2-008b7a882a3f" path="/var/lib/kubelet/pods/667399eb-3c03-434e-b2c2-008b7a882a3f/volumes" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.894963 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-sync-n9hkd"] Mar 09 09:36:51 crc kubenswrapper[4971]: E0309 09:36:51.895528 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667399eb-3c03-434e-b2c2-008b7a882a3f" containerName="registry-server" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.895541 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="667399eb-3c03-434e-b2c2-008b7a882a3f" containerName="registry-server" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.895647 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="667399eb-3c03-434e-b2c2-008b7a882a3f" containerName="registry-server" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.896030 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.898813 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.899247 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-p855s" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.899691 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.899933 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 09:36:51 crc kubenswrapper[4971]: I0309 09:36:51.904167 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-n9hkd"] Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.006229 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.006434 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2c8\" (UniqueName: \"kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.107999 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2c8\" (UniqueName: 
\"kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.108076 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.120088 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.123260 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg2c8\" (UniqueName: \"kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8\") pod \"keystone-db-sync-n9hkd\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.212149 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:36:52 crc kubenswrapper[4971]: I0309 09:36:52.639094 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-n9hkd"] Mar 09 09:36:52 crc kubenswrapper[4971]: W0309 09:36:52.649414 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda61e8dbe_0b29_4be0_a931_1ef393790f86.slice/crio-c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f WatchSource:0}: Error finding container c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f: Status 404 returned error can't find the container with id c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f Mar 09 09:36:53 crc kubenswrapper[4971]: I0309 09:36:53.211193 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" event={"ID":"a61e8dbe-0b29-4be0-a931-1ef393790f86","Type":"ContainerStarted","Data":"c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f"} Mar 09 09:36:58 crc kubenswrapper[4971]: I0309 09:36:58.086695 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:58 crc kubenswrapper[4971]: I0309 09:36:58.087257 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:58 crc kubenswrapper[4971]: I0309 09:36:58.167658 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:58 crc kubenswrapper[4971]: I0309 09:36:58.265942 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-p2vwl" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.267062 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/keystone-db-sync-n9hkd" event={"ID":"a61e8dbe-0b29-4be0-a931-1ef393790f86","Type":"ContainerStarted","Data":"f696cf4677bbf2f912323d0ee1352fb90272bdbdaf527ee6b58742236234a074"} Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.308976 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" podStartSLOduration=1.952236753 podStartE2EDuration="8.308959854s" podCreationTimestamp="2026-03-09 09:36:51 +0000 UTC" firstStartedPulling="2026-03-09 09:36:52.652856565 +0000 UTC m=+1016.212784375" lastFinishedPulling="2026-03-09 09:36:59.009579666 +0000 UTC m=+1022.569507476" observedRunningTime="2026-03-09 09:36:59.308048978 +0000 UTC m=+1022.867976788" watchObservedRunningTime="2026-03-09 09:36:59.308959854 +0000 UTC m=+1022.868887664" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.779162 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb"] Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.780920 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.784118 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.787272 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb"] Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.921679 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.921725 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:36:59 crc kubenswrapper[4971]: I0309 09:36:59.921758 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4x2\" (UniqueName: \"kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 
09:37:00.023408 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.023469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.023499 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4x2\" (UniqueName: \"kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.024003 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.024255 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.042264 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4x2\" (UniqueName: \"kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2\") pod \"30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.097146 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:00 crc kubenswrapper[4971]: I0309 09:37:00.586304 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb"] Mar 09 09:37:01 crc kubenswrapper[4971]: I0309 09:37:01.293413 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerID="4e61637ee05f458cc2f7ce39c643e2baf4421a3623182a99c9078ca9a925b492" exitCode=0 Mar 09 09:37:01 crc kubenswrapper[4971]: I0309 09:37:01.293493 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" event={"ID":"d7c7637a-be84-42d8-bb09-14af7f6acc0b","Type":"ContainerDied","Data":"4e61637ee05f458cc2f7ce39c643e2baf4421a3623182a99c9078ca9a925b492"} Mar 09 09:37:01 crc kubenswrapper[4971]: I0309 09:37:01.293923 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" event={"ID":"d7c7637a-be84-42d8-bb09-14af7f6acc0b","Type":"ContainerStarted","Data":"c6421edb95f1a9cec4354d42631ea93c4145da1d263e91634a2289779ecf7afb"} Mar 09 09:37:02 crc kubenswrapper[4971]: I0309 09:37:02.302013 4971 generic.go:334] "Generic (PLEG): container finished" podID="a61e8dbe-0b29-4be0-a931-1ef393790f86" containerID="f696cf4677bbf2f912323d0ee1352fb90272bdbdaf527ee6b58742236234a074" exitCode=0 Mar 09 09:37:02 crc kubenswrapper[4971]: I0309 09:37:02.302096 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" event={"ID":"a61e8dbe-0b29-4be0-a931-1ef393790f86","Type":"ContainerDied","Data":"f696cf4677bbf2f912323d0ee1352fb90272bdbdaf527ee6b58742236234a074"} Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.310551 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerID="5e4320bbe8c76a347950821713e462ba40b8bc5cd4f3af6ca6d322a51d1fd494" exitCode=0 Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.310663 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" event={"ID":"d7c7637a-be84-42d8-bb09-14af7f6acc0b","Type":"ContainerDied","Data":"5e4320bbe8c76a347950821713e462ba40b8bc5cd4f3af6ca6d322a51d1fd494"} Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.638589 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.709446 4971 scope.go:117] "RemoveContainer" containerID="d907e0b21b6ee71fb5dc6e199d32241ec70725df639686feb43c09ab193fa9d4" Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.770709 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data\") pod \"a61e8dbe-0b29-4be0-a931-1ef393790f86\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.770811 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg2c8\" (UniqueName: \"kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8\") pod \"a61e8dbe-0b29-4be0-a931-1ef393790f86\" (UID: \"a61e8dbe-0b29-4be0-a931-1ef393790f86\") " Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.780806 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8" (OuterVolumeSpecName: "kube-api-access-hg2c8") pod "a61e8dbe-0b29-4be0-a931-1ef393790f86" (UID: "a61e8dbe-0b29-4be0-a931-1ef393790f86"). InnerVolumeSpecName "kube-api-access-hg2c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.805057 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data" (OuterVolumeSpecName: "config-data") pod "a61e8dbe-0b29-4be0-a931-1ef393790f86" (UID: "a61e8dbe-0b29-4be0-a931-1ef393790f86"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.872161 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg2c8\" (UniqueName: \"kubernetes.io/projected/a61e8dbe-0b29-4be0-a931-1ef393790f86-kube-api-access-hg2c8\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:03 crc kubenswrapper[4971]: I0309 09:37:03.872188 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61e8dbe-0b29-4be0-a931-1ef393790f86-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.321826 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" event={"ID":"a61e8dbe-0b29-4be0-a931-1ef393790f86","Type":"ContainerDied","Data":"c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f"} Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.322223 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b84a6424b660b8f9e2ee1b8444415d79d277816940e2d214db6b3bcbf3431f" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.322307 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-n9hkd" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.327656 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerID="438c12108b4d4e364be6a615d336b49a582862359ee5f74526867fd6004ff716" exitCode=0 Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.327724 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" event={"ID":"d7c7637a-be84-42d8-bb09-14af7f6acc0b","Type":"ContainerDied","Data":"438c12108b4d4e364be6a615d336b49a582862359ee5f74526867fd6004ff716"} Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.515520 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-6snxg"] Mar 09 09:37:04 crc kubenswrapper[4971]: E0309 09:37:04.515815 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61e8dbe-0b29-4be0-a931-1ef393790f86" containerName="keystone-db-sync" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.515836 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61e8dbe-0b29-4be0-a931-1ef393790f86" containerName="keystone-db-sync" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.516059 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61e8dbe-0b29-4be0-a931-1ef393790f86" containerName="keystone-db-sync" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.516515 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.517865 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.517917 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.518643 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-p855s" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.518818 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"osp-secret" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.518996 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.532045 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-6snxg"] Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.588076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbrv\" (UniqueName: \"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.588217 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.588253 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.588343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.588393 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.689216 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbrv\" (UniqueName: \"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.689311 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.689362 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.690401 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.690446 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.693193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.693426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.693530 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.694701 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.705107 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbrv\" (UniqueName: \"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv\") pod \"keystone-bootstrap-6snxg\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:04 crc kubenswrapper[4971]: I0309 09:37:04.832256 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.074436 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-6snxg"] Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.337174 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" event={"ID":"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd","Type":"ContainerStarted","Data":"6fca691ed1e7095eb057b9fa1f3238ac7ba426b229b8fed4dc336d76f3f79fc6"} Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.337591 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" event={"ID":"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd","Type":"ContainerStarted","Data":"b23ac4268ea058ce924e3c6e8d4e4d3c1467b6e9fd8e6ecf9bce80d59463a5cf"} Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.358074 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" podStartSLOduration=1.358056608 podStartE2EDuration="1.358056608s" podCreationTimestamp="2026-03-09 09:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:37:05.353014001 +0000 UTC m=+1028.912941821" watchObservedRunningTime="2026-03-09 09:37:05.358056608 +0000 UTC m=+1028.917984418" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.581541 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.709810 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util\") pod \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.710035 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle\") pod \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.710075 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4x2\" (UniqueName: \"kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2\") pod \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\" (UID: \"d7c7637a-be84-42d8-bb09-14af7f6acc0b\") " Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.712254 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle" (OuterVolumeSpecName: "bundle") pod "d7c7637a-be84-42d8-bb09-14af7f6acc0b" (UID: "d7c7637a-be84-42d8-bb09-14af7f6acc0b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.717651 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2" (OuterVolumeSpecName: "kube-api-access-vd4x2") pod "d7c7637a-be84-42d8-bb09-14af7f6acc0b" (UID: "d7c7637a-be84-42d8-bb09-14af7f6acc0b"). InnerVolumeSpecName "kube-api-access-vd4x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.725894 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util" (OuterVolumeSpecName: "util") pod "d7c7637a-be84-42d8-bb09-14af7f6acc0b" (UID: "d7c7637a-be84-42d8-bb09-14af7f6acc0b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.811494 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.811716 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4x2\" (UniqueName: \"kubernetes.io/projected/d7c7637a-be84-42d8-bb09-14af7f6acc0b-kube-api-access-vd4x2\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:05 crc kubenswrapper[4971]: I0309 09:37:05.811804 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7c7637a-be84-42d8-bb09-14af7f6acc0b-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:06 crc kubenswrapper[4971]: I0309 09:37:06.348234 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" Mar 09 09:37:06 crc kubenswrapper[4971]: I0309 09:37:06.348507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb" event={"ID":"d7c7637a-be84-42d8-bb09-14af7f6acc0b","Type":"ContainerDied","Data":"c6421edb95f1a9cec4354d42631ea93c4145da1d263e91634a2289779ecf7afb"} Mar 09 09:37:06 crc kubenswrapper[4971]: I0309 09:37:06.349516 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6421edb95f1a9cec4354d42631ea93c4145da1d263e91634a2289779ecf7afb" Mar 09 09:37:08 crc kubenswrapper[4971]: I0309 09:37:08.367400 4971 generic.go:334] "Generic (PLEG): container finished" podID="e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" containerID="6fca691ed1e7095eb057b9fa1f3238ac7ba426b229b8fed4dc336d76f3f79fc6" exitCode=0 Mar 09 09:37:08 crc kubenswrapper[4971]: I0309 09:37:08.367482 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" event={"ID":"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd","Type":"ContainerDied","Data":"6fca691ed1e7095eb057b9fa1f3238ac7ba426b229b8fed4dc336d76f3f79fc6"} Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.664962 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.772974 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbrv\" (UniqueName: \"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv\") pod \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.773019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts\") pod \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.773066 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys\") pod \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.773106 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data\") pod \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.774020 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys\") pod \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\" (UID: \"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd\") " Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.778967 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv" (OuterVolumeSpecName: "kube-api-access-gbbrv") pod "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" (UID: "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd"). InnerVolumeSpecName "kube-api-access-gbbrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.779277 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" (UID: "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.779602 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" (UID: "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.780007 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts" (OuterVolumeSpecName: "scripts") pod "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" (UID: "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.796244 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data" (OuterVolumeSpecName: "config-data") pod "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" (UID: "e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.876218 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.876264 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbbrv\" (UniqueName: \"kubernetes.io/projected/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-kube-api-access-gbbrv\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.876280 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.876290 4971 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:09 crc kubenswrapper[4971]: I0309 09:37:09.876301 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.380906 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" event={"ID":"e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd","Type":"ContainerDied","Data":"b23ac4268ea058ce924e3c6e8d4e4d3c1467b6e9fd8e6ecf9bce80d59463a5cf"} Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.380945 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b23ac4268ea058ce924e3c6e8d4e4d3c1467b6e9fd8e6ecf9bce80d59463a5cf" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.380974 4971 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-6snxg" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.461723 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"] Mar 09 09:37:10 crc kubenswrapper[4971]: E0309 09:37:10.462009 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="util" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462030 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="util" Mar 09 09:37:10 crc kubenswrapper[4971]: E0309 09:37:10.462046 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="pull" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462054 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="pull" Mar 09 09:37:10 crc kubenswrapper[4971]: E0309 09:37:10.462072 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="extract" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462080 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="extract" Mar 09 09:37:10 crc kubenswrapper[4971]: E0309 09:37:10.462097 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" containerName="keystone-bootstrap" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462105 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" containerName="keystone-bootstrap" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462245 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c7637a-be84-42d8-bb09-14af7f6acc0b" containerName="extract" 
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462259 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" containerName="keystone-bootstrap" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.462791 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.464913 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-p855s" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.464997 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.465362 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.466427 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.475891 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"] Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.586720 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-fernet-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm" Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.586801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-credential-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: 
\"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.586832 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znbg\" (UniqueName: \"kubernetes.io/projected/feed0de4-5e24-4a88-8a8c-4552940e76bb-kube-api-access-4znbg\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.586941 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-scripts\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.587033 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-config-data\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.689088 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-fernet-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.689167 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-credential-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.689191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znbg\" (UniqueName: \"kubernetes.io/projected/feed0de4-5e24-4a88-8a8c-4552940e76bb-kube-api-access-4znbg\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.689212 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-scripts\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.689231 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-config-data\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.693445 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-credential-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.693592 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-config-data\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.697747 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-fernet-keys\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.702549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feed0de4-5e24-4a88-8a8c-4552940e76bb-scripts\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.708755 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znbg\" (UniqueName: \"kubernetes.io/projected/feed0de4-5e24-4a88-8a8c-4552940e76bb-kube-api-access-4znbg\") pod \"keystone-5b6d9bc6b9-jlndm\" (UID: \"feed0de4-5e24-4a88-8a8c-4552940e76bb\") " pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:10 crc kubenswrapper[4971]: I0309 09:37:10.777611 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:11 crc kubenswrapper[4971]: I0309 09:37:11.181730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"]
Mar 09 09:37:11 crc kubenswrapper[4971]: I0309 09:37:11.388827 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm" event={"ID":"feed0de4-5e24-4a88-8a8c-4552940e76bb","Type":"ContainerStarted","Data":"14e7338649aa4c5af1eee6e76ed6ec7481a998e3499886d831fbfd6245c4d1b8"}
Mar 09 09:37:11 crc kubenswrapper[4971]: I0309 09:37:11.388869 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm" event={"ID":"feed0de4-5e24-4a88-8a8c-4552940e76bb","Type":"ContainerStarted","Data":"c23f2e35a08bebfd23fe8959b053330dd66d697ffb7261850461eca763797125"}
Mar 09 09:37:11 crc kubenswrapper[4971]: I0309 09:37:11.389079 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:11 crc kubenswrapper[4971]: I0309 09:37:11.408125 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm" podStartSLOduration=1.408100999 podStartE2EDuration="1.408100999s" podCreationTimestamp="2026-03-09 09:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:37:11.403105414 +0000 UTC m=+1034.963033214" watchObservedRunningTime="2026-03-09 09:37:11.408100999 +0000 UTC m=+1034.968028809"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.817541 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"]
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.819061 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.828178 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-m9v4z"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.828747 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.860739 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"]
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.887385 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nw7x\" (UniqueName: \"kubernetes.io/projected/732b8106-b919-410c-b481-43320eb43604-kube-api-access-4nw7x\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.887437 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-apiservice-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.887510 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-webhook-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.988432 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nw7x\" (UniqueName: \"kubernetes.io/projected/732b8106-b919-410c-b481-43320eb43604-kube-api-access-4nw7x\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.988788 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-apiservice-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.988815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-webhook-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:17 crc kubenswrapper[4971]: I0309 09:37:17.995869 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-webhook-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:18 crc kubenswrapper[4971]: I0309 09:37:18.004564 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/732b8106-b919-410c-b481-43320eb43604-apiservice-cert\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:18 crc kubenswrapper[4971]: I0309 09:37:18.014653 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nw7x\" (UniqueName: \"kubernetes.io/projected/732b8106-b919-410c-b481-43320eb43604-kube-api-access-4nw7x\") pod \"barbican-operator-controller-manager-5bfb4fc94-fzpfg\" (UID: \"732b8106-b919-410c-b481-43320eb43604\") " pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:18 crc kubenswrapper[4971]: I0309 09:37:18.162202 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:18 crc kubenswrapper[4971]: I0309 09:37:18.613580 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"]
Mar 09 09:37:19 crc kubenswrapper[4971]: I0309 09:37:19.438070 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg" event={"ID":"732b8106-b919-410c-b481-43320eb43604","Type":"ContainerStarted","Data":"aaa32b03a9eba64ee2d105465b6bd2c1de220d88b215cee0dbba1ef96523fa8b"}
Mar 09 09:37:23 crc kubenswrapper[4971]: I0309 09:37:23.462902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg" event={"ID":"732b8106-b919-410c-b481-43320eb43604","Type":"ContainerStarted","Data":"eeb60bf7a5aa85b00b27186e609fe0602bddb5b6d619b41c9a7fa34c68ed5b6f"}
Mar 09 09:37:23 crc kubenswrapper[4971]: I0309 09:37:23.463499 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:23 crc kubenswrapper[4971]: I0309 09:37:23.486895 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg" podStartSLOduration=2.210193187 podStartE2EDuration="6.486877383s" podCreationTimestamp="2026-03-09 09:37:17 +0000 UTC" firstStartedPulling="2026-03-09 09:37:18.622185091 +0000 UTC m=+1042.182112901" lastFinishedPulling="2026-03-09 09:37:22.898869287 +0000 UTC m=+1046.458797097" observedRunningTime="2026-03-09 09:37:23.480224499 +0000 UTC m=+1047.040152329" watchObservedRunningTime="2026-03-09 09:37:23.486877383 +0000 UTC m=+1047.046805193"
Mar 09 09:37:28 crc kubenswrapper[4971]: I0309 09:37:28.169991 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5bfb4fc94-fzpfg"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.547030 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.548067 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.549929 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-45kf4"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.563064 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.601330 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn585\" (UniqueName: \"kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585\") pod \"swift-operator-index-h772j\" (UID: \"0f5272ac-d8e3-45ac-b7c0-51279a2416eb\") " pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.703107 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn585\" (UniqueName: \"kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585\") pod \"swift-operator-index-h772j\" (UID: \"0f5272ac-d8e3-45ac-b7c0-51279a2416eb\") " pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.729576 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn585\" (UniqueName: \"kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585\") pod \"swift-operator-index-h772j\" (UID: \"0f5272ac-d8e3-45ac-b7c0-51279a2416eb\") " pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:32 crc kubenswrapper[4971]: I0309 09:37:32.869743 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:33 crc kubenswrapper[4971]: I0309 09:37:33.288477 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:33 crc kubenswrapper[4971]: I0309 09:37:33.537555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-h772j" event={"ID":"0f5272ac-d8e3-45ac-b7c0-51279a2416eb","Type":"ContainerStarted","Data":"cb279243cbafa24cf032bffeb84ed0fc38ce3b9360357ec0707d3985894ca143"}
Mar 09 09:37:36 crc kubenswrapper[4971]: I0309 09:37:36.586077 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-h772j" event={"ID":"0f5272ac-d8e3-45ac-b7c0-51279a2416eb","Type":"ContainerStarted","Data":"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"}
Mar 09 09:37:36 crc kubenswrapper[4971]: I0309 09:37:36.927897 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-h772j" podStartSLOduration=2.138780484 podStartE2EDuration="4.927877549s" podCreationTimestamp="2026-03-09 09:37:32 +0000 UTC" firstStartedPulling="2026-03-09 09:37:33.288319121 +0000 UTC m=+1056.848246931" lastFinishedPulling="2026-03-09 09:37:36.077416186 +0000 UTC m=+1059.637343996" observedRunningTime="2026-03-09 09:37:36.599191899 +0000 UTC m=+1060.159119709" watchObservedRunningTime="2026-03-09 09:37:36.927877549 +0000 UTC m=+1060.487805359"
Mar 09 09:37:36 crc kubenswrapper[4971]: I0309 09:37:36.932576 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.539160 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-jc7b2"]
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.539965 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.548079 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-jc7b2"]
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.698047 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhhsm\" (UniqueName: \"kubernetes.io/projected/97cd3aa2-fa2c-4950-aadc-75530bbfe9bb-kube-api-access-hhhsm\") pod \"swift-operator-index-jc7b2\" (UID: \"97cd3aa2-fa2c-4950-aadc-75530bbfe9bb\") " pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.799035 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhhsm\" (UniqueName: \"kubernetes.io/projected/97cd3aa2-fa2c-4950-aadc-75530bbfe9bb-kube-api-access-hhhsm\") pod \"swift-operator-index-jc7b2\" (UID: \"97cd3aa2-fa2c-4950-aadc-75530bbfe9bb\") " pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.816735 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhhsm\" (UniqueName: \"kubernetes.io/projected/97cd3aa2-fa2c-4950-aadc-75530bbfe9bb-kube-api-access-hhhsm\") pod \"swift-operator-index-jc7b2\" (UID: \"97cd3aa2-fa2c-4950-aadc-75530bbfe9bb\") " pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:37 crc kubenswrapper[4971]: I0309 09:37:37.855513 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:38 crc kubenswrapper[4971]: I0309 09:37:38.248212 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-jc7b2"]
Mar 09 09:37:38 crc kubenswrapper[4971]: W0309 09:37:38.252543 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97cd3aa2_fa2c_4950_aadc_75530bbfe9bb.slice/crio-7b973486a6604610ecf34d0f65c13299d3ae4a631888308095bb8a674c9789f5 WatchSource:0}: Error finding container 7b973486a6604610ecf34d0f65c13299d3ae4a631888308095bb8a674c9789f5: Status 404 returned error can't find the container with id 7b973486a6604610ecf34d0f65c13299d3ae4a631888308095bb8a674c9789f5
Mar 09 09:37:38 crc kubenswrapper[4971]: I0309 09:37:38.602254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-jc7b2" event={"ID":"97cd3aa2-fa2c-4950-aadc-75530bbfe9bb","Type":"ContainerStarted","Data":"f146b50c1119e9fe43b3ed464c89b45f2e4016417280fa21779d649744fd3702"}
Mar 09 09:37:38 crc kubenswrapper[4971]: I0309 09:37:38.602578 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-jc7b2" event={"ID":"97cd3aa2-fa2c-4950-aadc-75530bbfe9bb","Type":"ContainerStarted","Data":"7b973486a6604610ecf34d0f65c13299d3ae4a631888308095bb8a674c9789f5"}
Mar 09 09:37:38 crc kubenswrapper[4971]: I0309 09:37:38.602517 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-h772j" podUID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" containerName="registry-server" containerID="cri-o://2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e" gracePeriod=2
Mar 09 09:37:38 crc kubenswrapper[4971]: I0309 09:37:38.619974 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-jc7b2" podStartSLOduration=1.559562415 podStartE2EDuration="1.619950647s" podCreationTimestamp="2026-03-09 09:37:37 +0000 UTC" firstStartedPulling="2026-03-09 09:37:38.256196954 +0000 UTC m=+1061.816124764" lastFinishedPulling="2026-03-09 09:37:38.316585186 +0000 UTC m=+1061.876512996" observedRunningTime="2026-03-09 09:37:38.617564467 +0000 UTC m=+1062.177492277" watchObservedRunningTime="2026-03-09 09:37:38.619950647 +0000 UTC m=+1062.179878467"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.049397 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.215393 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn585\" (UniqueName: \"kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585\") pod \"0f5272ac-d8e3-45ac-b7c0-51279a2416eb\" (UID: \"0f5272ac-d8e3-45ac-b7c0-51279a2416eb\") "
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.221669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585" (OuterVolumeSpecName: "kube-api-access-zn585") pod "0f5272ac-d8e3-45ac-b7c0-51279a2416eb" (UID: "0f5272ac-d8e3-45ac-b7c0-51279a2416eb"). InnerVolumeSpecName "kube-api-access-zn585". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.317440 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn585\" (UniqueName: \"kubernetes.io/projected/0f5272ac-d8e3-45ac-b7c0-51279a2416eb-kube-api-access-zn585\") on node \"crc\" DevicePath \"\""
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.611413 4971 generic.go:334] "Generic (PLEG): container finished" podID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" containerID="2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e" exitCode=0
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.611733 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-h772j" event={"ID":"0f5272ac-d8e3-45ac-b7c0-51279a2416eb","Type":"ContainerDied","Data":"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"}
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.611802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-h772j" event={"ID":"0f5272ac-d8e3-45ac-b7c0-51279a2416eb","Type":"ContainerDied","Data":"cb279243cbafa24cf032bffeb84ed0fc38ce3b9360357ec0707d3985894ca143"}
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.611831 4971 scope.go:117] "RemoveContainer" containerID="2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.612116 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-h772j"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.633807 4971 scope.go:117] "RemoveContainer" containerID="2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"
Mar 09 09:37:39 crc kubenswrapper[4971]: E0309 09:37:39.634305 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e\": container with ID starting with 2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e not found: ID does not exist" containerID="2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.634376 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e"} err="failed to get container status \"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e\": rpc error: code = NotFound desc = could not find container \"2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e\": container with ID starting with 2796681d5e46c46b3a1cfc0798f5a8cd3521de7e359e90c6788265bd2e08906e not found: ID does not exist"
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.662755 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:39 crc kubenswrapper[4971]: I0309 09:37:39.669103 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-h772j"]
Mar 09 09:37:41 crc kubenswrapper[4971]: I0309 09:37:41.160729 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" path="/var/lib/kubelet/pods/0f5272ac-d8e3-45ac-b7c0-51279a2416eb/volumes"
Mar 09 09:37:42 crc kubenswrapper[4971]: I0309 09:37:42.448425 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/keystone-5b6d9bc6b9-jlndm"
Mar 09 09:37:47 crc kubenswrapper[4971]: I0309 09:37:47.855991 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:47 crc kubenswrapper[4971]: I0309 09:37:47.856582 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:47 crc kubenswrapper[4971]: I0309 09:37:47.881663 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:48 crc kubenswrapper[4971]: I0309 09:37:48.698844 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-jc7b2"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.291618 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-create-szs62"]
Mar 09 09:37:50 crc kubenswrapper[4971]: E0309 09:37:50.292251 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" containerName="registry-server"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.292270 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" containerName="registry-server"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.292480 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5272ac-d8e3-45ac-b7c0-51279a2416eb" containerName="registry-server"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.293038 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.333781 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"]
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.335036 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.337081 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-db-secret"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.340056 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-szs62"]
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.347783 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"]
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.478496 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd6bx\" (UniqueName: \"kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.478627 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.478733 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.478787 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsq7\" (UniqueName: \"kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.580670 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.580723 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.580746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsq7\" (UniqueName: \"kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.580794 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd6bx\" (UniqueName: \"kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.581642 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.582263 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.600294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsq7\" (UniqueName: \"kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7\") pod \"barbican-db-create-szs62\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.604061 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd6bx\" (UniqueName: \"kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx\") pod \"barbican-1b72-account-create-update-z76fh\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") " pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.651058 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:50 crc kubenswrapper[4971]: I0309 09:37:50.658402 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.141087 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-szs62"]
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.273206 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"]
Mar 09 09:37:51 crc kubenswrapper[4971]: W0309 09:37:51.277854 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6609af45_62cb_4830_b4d1_39700af89b1b.slice/crio-ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c WatchSource:0}: Error finding container ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c: Status 404 returned error can't find the container with id ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.691955 4971 generic.go:334] "Generic (PLEG): container finished" podID="3248073d-6eed-48b2-a088-b84c57ae3579" containerID="6bd0613e16689845322a70359db1f3b709f820c2d340682b22ecca4e72b9e5b7" exitCode=0
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.692052 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-szs62" event={"ID":"3248073d-6eed-48b2-a088-b84c57ae3579","Type":"ContainerDied","Data":"6bd0613e16689845322a70359db1f3b709f820c2d340682b22ecca4e72b9e5b7"}
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.692097 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-szs62" event={"ID":"3248073d-6eed-48b2-a088-b84c57ae3579","Type":"ContainerStarted","Data":"8d342c7b554f2925a364fbe1e06c48b4492e72343866e9701f82bee09636704d"}
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.693622 4971 generic.go:334] "Generic (PLEG): container finished" podID="6609af45-62cb-4830-b4d1-39700af89b1b" containerID="44a5ac950525c0a616a0dfeea6f099440a2ce95a3f797d003275ffdb5cfa379a" exitCode=0
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.693660 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh" event={"ID":"6609af45-62cb-4830-b4d1-39700af89b1b","Type":"ContainerDied","Data":"44a5ac950525c0a616a0dfeea6f099440a2ce95a3f797d003275ffdb5cfa379a"}
Mar 09 09:37:51 crc kubenswrapper[4971]: I0309 09:37:51.693680 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh" event={"ID":"6609af45-62cb-4830-b4d1-39700af89b1b","Type":"ContainerStarted","Data":"ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c"}
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.073369 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.159636 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-szs62"
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.222827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts\") pod \"6609af45-62cb-4830-b4d1-39700af89b1b\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") "
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.223000 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd6bx\" (UniqueName: \"kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx\") pod \"6609af45-62cb-4830-b4d1-39700af89b1b\" (UID: \"6609af45-62cb-4830-b4d1-39700af89b1b\") "
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.223957 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6609af45-62cb-4830-b4d1-39700af89b1b" (UID: "6609af45-62cb-4830-b4d1-39700af89b1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.229785 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx" (OuterVolumeSpecName: "kube-api-access-cd6bx") pod "6609af45-62cb-4830-b4d1-39700af89b1b" (UID: "6609af45-62cb-4830-b4d1-39700af89b1b"). InnerVolumeSpecName "kube-api-access-cd6bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.324511 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts\") pod \"3248073d-6eed-48b2-a088-b84c57ae3579\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.324679 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svsq7\" (UniqueName: \"kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7\") pod \"3248073d-6eed-48b2-a088-b84c57ae3579\" (UID: \"3248073d-6eed-48b2-a088-b84c57ae3579\") " Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.325040 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6609af45-62cb-4830-b4d1-39700af89b1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.325064 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd6bx\" (UniqueName: \"kubernetes.io/projected/6609af45-62cb-4830-b4d1-39700af89b1b-kube-api-access-cd6bx\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.325613 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3248073d-6eed-48b2-a088-b84c57ae3579" (UID: "3248073d-6eed-48b2-a088-b84c57ae3579"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.327954 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7" (OuterVolumeSpecName: "kube-api-access-svsq7") pod "3248073d-6eed-48b2-a088-b84c57ae3579" (UID: "3248073d-6eed-48b2-a088-b84c57ae3579"). InnerVolumeSpecName "kube-api-access-svsq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.426834 4971 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3248073d-6eed-48b2-a088-b84c57ae3579-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.426881 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svsq7\" (UniqueName: \"kubernetes.io/projected/3248073d-6eed-48b2-a088-b84c57ae3579-kube-api-access-svsq7\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.711640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-szs62" event={"ID":"3248073d-6eed-48b2-a088-b84c57ae3579","Type":"ContainerDied","Data":"8d342c7b554f2925a364fbe1e06c48b4492e72343866e9701f82bee09636704d"} Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.711895 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d342c7b554f2925a364fbe1e06c48b4492e72343866e9701f82bee09636704d" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.711708 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-szs62" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.713375 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh" event={"ID":"6609af45-62cb-4830-b4d1-39700af89b1b","Type":"ContainerDied","Data":"ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c"} Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.713411 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad39161c6d2790cf714476506bf342f5662273f9443791542892b38ce1fca73c" Mar 09 09:37:53 crc kubenswrapper[4971]: I0309 09:37:53.713421 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-1b72-account-create-update-z76fh" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.484490 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-sync-5rgjq"] Mar 09 09:37:55 crc kubenswrapper[4971]: E0309 09:37:55.484877 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6609af45-62cb-4830-b4d1-39700af89b1b" containerName="mariadb-account-create-update" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.484892 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6609af45-62cb-4830-b4d1-39700af89b1b" containerName="mariadb-account-create-update" Mar 09 09:37:55 crc kubenswrapper[4971]: E0309 09:37:55.484909 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3248073d-6eed-48b2-a088-b84c57ae3579" containerName="mariadb-database-create" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.484917 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3248073d-6eed-48b2-a088-b84c57ae3579" containerName="mariadb-database-create" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.485073 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6609af45-62cb-4830-b4d1-39700af89b1b" containerName="mariadb-account-create-update" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.485097 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3248073d-6eed-48b2-a088-b84c57ae3579" containerName="mariadb-database-create" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.485631 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.488537 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-5rgjq"] Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.488562 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-k4sfg" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.491943 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.657166 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9wl\" (UniqueName: \"kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.657216 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.758064 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cq9wl\" (UniqueName: \"kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.758102 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.762563 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.775216 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9wl\" (UniqueName: \"kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl\") pod \"barbican-db-sync-5rgjq\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") " pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:55 crc kubenswrapper[4971]: I0309 09:37:55.819920 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.149934 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-5rgjq"] Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.375215 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt"] Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.376694 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.382369 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w69pb" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.387201 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt"] Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.469298 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzbg\" (UniqueName: \"kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.469366 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " 
pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.469387 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.571170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzbg\" (UniqueName: \"kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.572136 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.572290 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 
09:37:56.573018 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.573617 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.595550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzbg\" (UniqueName: \"kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg\") pod \"dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.712269 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.735243 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" event={"ID":"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8","Type":"ContainerStarted","Data":"95a7f12bca2f837c6b97185bc2e9dac5377fb7dbb62ee36dd7bbf3757ac5604b"} Mar 09 09:37:56 crc kubenswrapper[4971]: I0309 09:37:56.981584 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt"] Mar 09 09:37:57 crc kubenswrapper[4971]: W0309 09:37:57.018498 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod566d73b4_920e_430e_ab8c_da58c5834dce.slice/crio-0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827 WatchSource:0}: Error finding container 0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827: Status 404 returned error can't find the container with id 0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827 Mar 09 09:37:57 crc kubenswrapper[4971]: I0309 09:37:57.743559 4971 generic.go:334] "Generic (PLEG): container finished" podID="566d73b4-920e-430e-ab8c-da58c5834dce" containerID="36b3da4df6d7506b0c2a06af1bad1dfd3fd94e9805bfd4f8cbd3c27fa211b5d1" exitCode=0 Mar 09 09:37:57 crc kubenswrapper[4971]: I0309 09:37:57.743731 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" event={"ID":"566d73b4-920e-430e-ab8c-da58c5834dce","Type":"ContainerDied","Data":"36b3da4df6d7506b0c2a06af1bad1dfd3fd94e9805bfd4f8cbd3c27fa211b5d1"} Mar 09 09:37:57 crc kubenswrapper[4971]: I0309 09:37:57.744047 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" event={"ID":"566d73b4-920e-430e-ab8c-da58c5834dce","Type":"ContainerStarted","Data":"0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827"} Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.128194 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550818-248bz"] Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.129126 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-248bz" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.134914 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.135085 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.135382 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.137076 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-248bz"] Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.233004 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t94jz\" (UniqueName: \"kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz\") pod \"auto-csr-approver-29550818-248bz\" (UID: \"a672281e-9e08-4c3f-8c0c-fd4acd6f0666\") " pod="openshift-infra/auto-csr-approver-29550818-248bz" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.335056 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t94jz\" (UniqueName: 
\"kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz\") pod \"auto-csr-approver-29550818-248bz\" (UID: \"a672281e-9e08-4c3f-8c0c-fd4acd6f0666\") " pod="openshift-infra/auto-csr-approver-29550818-248bz" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.361643 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t94jz\" (UniqueName: \"kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz\") pod \"auto-csr-approver-29550818-248bz\" (UID: \"a672281e-9e08-4c3f-8c0c-fd4acd6f0666\") " pod="openshift-infra/auto-csr-approver-29550818-248bz" Mar 09 09:38:00 crc kubenswrapper[4971]: I0309 09:38:00.448120 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-248bz" Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.236972 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-248bz"] Mar 09 09:38:01 crc kubenswrapper[4971]: W0309 09:38:01.241237 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda672281e_9e08_4c3f_8c0c_fd4acd6f0666.slice/crio-79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7 WatchSource:0}: Error finding container 79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7: Status 404 returned error can't find the container with id 79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7 Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.772913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-248bz" event={"ID":"a672281e-9e08-4c3f-8c0c-fd4acd6f0666","Type":"ContainerStarted","Data":"79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7"} Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.775593 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="566d73b4-920e-430e-ab8c-da58c5834dce" containerID="63b11ff9138722e02e21d65827149ab4178cb84d2eba4127aee484e9d37229ed" exitCode=0 Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.775658 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" event={"ID":"566d73b4-920e-430e-ab8c-da58c5834dce","Type":"ContainerDied","Data":"63b11ff9138722e02e21d65827149ab4178cb84d2eba4127aee484e9d37229ed"} Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.777863 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" event={"ID":"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8","Type":"ContainerStarted","Data":"d299a8d58ef6971c841c838b77f422d903fdb71532eb097a7864010656ee24c8"} Mar 09 09:38:01 crc kubenswrapper[4971]: I0309 09:38:01.819287 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" podStartSLOduration=2.176870724 podStartE2EDuration="6.819268741s" podCreationTimestamp="2026-03-09 09:37:55 +0000 UTC" firstStartedPulling="2026-03-09 09:37:56.167438393 +0000 UTC m=+1079.727366203" lastFinishedPulling="2026-03-09 09:38:00.80983642 +0000 UTC m=+1084.369764220" observedRunningTime="2026-03-09 09:38:01.81716518 +0000 UTC m=+1085.377092990" watchObservedRunningTime="2026-03-09 09:38:01.819268741 +0000 UTC m=+1085.379196551" Mar 09 09:38:02 crc kubenswrapper[4971]: I0309 09:38:02.790212 4971 generic.go:334] "Generic (PLEG): container finished" podID="566d73b4-920e-430e-ab8c-da58c5834dce" containerID="44bac8fba7fc026eda5ca70abb9583856dfae14bae50d3e325c4a00e89c8d7a4" exitCode=0 Mar 09 09:38:02 crc kubenswrapper[4971]: I0309 09:38:02.790262 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" 
event={"ID":"566d73b4-920e-430e-ab8c-da58c5834dce","Type":"ContainerDied","Data":"44bac8fba7fc026eda5ca70abb9583856dfae14bae50d3e325c4a00e89c8d7a4"} Mar 09 09:38:03 crc kubenswrapper[4971]: I0309 09:38:03.798621 4971 generic.go:334] "Generic (PLEG): container finished" podID="a672281e-9e08-4c3f-8c0c-fd4acd6f0666" containerID="395ee7a0aa47b6d164abe6bfa7fab3fefc7833a6a2a4b2b407ab04f1e5f34459" exitCode=0 Mar 09 09:38:03 crc kubenswrapper[4971]: I0309 09:38:03.799602 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-248bz" event={"ID":"a672281e-9e08-4c3f-8c0c-fd4acd6f0666","Type":"ContainerDied","Data":"395ee7a0aa47b6d164abe6bfa7fab3fefc7833a6a2a4b2b407ab04f1e5f34459"} Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.114093 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.292726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbzbg\" (UniqueName: \"kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg\") pod \"566d73b4-920e-430e-ab8c-da58c5834dce\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.292827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util\") pod \"566d73b4-920e-430e-ab8c-da58c5834dce\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") " Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.292914 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle\") pod \"566d73b4-920e-430e-ab8c-da58c5834dce\" (UID: \"566d73b4-920e-430e-ab8c-da58c5834dce\") 
" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.293921 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle" (OuterVolumeSpecName: "bundle") pod "566d73b4-920e-430e-ab8c-da58c5834dce" (UID: "566d73b4-920e-430e-ab8c-da58c5834dce"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.298950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg" (OuterVolumeSpecName: "kube-api-access-cbzbg") pod "566d73b4-920e-430e-ab8c-da58c5834dce" (UID: "566d73b4-920e-430e-ab8c-da58c5834dce"). InnerVolumeSpecName "kube-api-access-cbzbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.310221 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util" (OuterVolumeSpecName: "util") pod "566d73b4-920e-430e-ab8c-da58c5834dce" (UID: "566d73b4-920e-430e-ab8c-da58c5834dce"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.394499 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbzbg\" (UniqueName: \"kubernetes.io/projected/566d73b4-920e-430e-ab8c-da58c5834dce-kube-api-access-cbzbg\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.394535 4971 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.394546 4971 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/566d73b4-920e-430e-ab8c-da58c5834dce-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.808293 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt" event={"ID":"566d73b4-920e-430e-ab8c-da58c5834dce","Type":"ContainerDied","Data":"0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827"} Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.808333 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f71e3ca5edcb63205928c63f2eea54a84f07fb3531db1b0a98eb16443885827" Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.808338 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt"
Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.811419 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" containerID="d299a8d58ef6971c841c838b77f422d903fdb71532eb097a7864010656ee24c8" exitCode=0
Mar 09 09:38:04 crc kubenswrapper[4971]: I0309 09:38:04.811518 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" event={"ID":"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8","Type":"ContainerDied","Data":"d299a8d58ef6971c841c838b77f422d903fdb71532eb097a7864010656ee24c8"}
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.121926 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-248bz"
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.207097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t94jz\" (UniqueName: \"kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz\") pod \"a672281e-9e08-4c3f-8c0c-fd4acd6f0666\" (UID: \"a672281e-9e08-4c3f-8c0c-fd4acd6f0666\") "
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.223153 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz" (OuterVolumeSpecName: "kube-api-access-t94jz") pod "a672281e-9e08-4c3f-8c0c-fd4acd6f0666" (UID: "a672281e-9e08-4c3f-8c0c-fd4acd6f0666"). InnerVolumeSpecName "kube-api-access-t94jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.308594 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t94jz\" (UniqueName: \"kubernetes.io/projected/a672281e-9e08-4c3f-8c0c-fd4acd6f0666-kube-api-access-t94jz\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.819740 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-248bz" event={"ID":"a672281e-9e08-4c3f-8c0c-fd4acd6f0666","Type":"ContainerDied","Data":"79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7"}
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.819768 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-248bz"
Mar 09 09:38:05 crc kubenswrapper[4971]: I0309 09:38:05.819778 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d348b15795adc1e8795dc52d90b86db99c05017a2671c26dbe8e57ef63d2e7"
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.055894 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-5rgjq"
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.171910 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-gphdx"]
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.177026 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-gphdx"]
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.218926 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data\") pod \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") "
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.219145 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq9wl\" (UniqueName: \"kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl\") pod \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\" (UID: \"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8\") "
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.222871 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl" (OuterVolumeSpecName: "kube-api-access-cq9wl") pod "1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" (UID: "1a41ed0b-fe0b-486e-844e-4f0cfa225bb8"). InnerVolumeSpecName "kube-api-access-cq9wl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.222960 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" (UID: "1a41ed0b-fe0b-486e-844e-4f0cfa225bb8").
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.321334 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq9wl\" (UniqueName: \"kubernetes.io/projected/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-kube-api-access-cq9wl\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.321393 4971 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a41ed0b-fe0b-486e-844e-4f0cfa225bb8-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.828688 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-5rgjq" event={"ID":"1a41ed0b-fe0b-486e-844e-4f0cfa225bb8","Type":"ContainerDied","Data":"95a7f12bca2f837c6b97185bc2e9dac5377fb7dbb62ee36dd7bbf3757ac5604b"}
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.828729 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a7f12bca2f837c6b97185bc2e9dac5377fb7dbb62ee36dd7bbf3757ac5604b"
Mar 09 09:38:06 crc kubenswrapper[4971]: I0309 09:38:06.828782 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-5rgjq"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040104 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"]
Mar 09 09:38:07 crc kubenswrapper[4971]: E0309 09:38:07.040407 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="pull"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040425 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="pull"
Mar 09 09:38:07 crc kubenswrapper[4971]: E0309 09:38:07.040434 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a672281e-9e08-4c3f-8c0c-fd4acd6f0666" containerName="oc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040440 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a672281e-9e08-4c3f-8c0c-fd4acd6f0666" containerName="oc"
Mar 09 09:38:07 crc kubenswrapper[4971]: E0309 09:38:07.040458 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" containerName="barbican-db-sync"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040466 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" containerName="barbican-db-sync"
Mar 09 09:38:07 crc kubenswrapper[4971]: E0309 09:38:07.040473 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="util"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040478 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="util"
Mar 09 09:38:07 crc kubenswrapper[4971]: E0309 09:38:07.040488 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="extract"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040493 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="extract"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040609 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a672281e-9e08-4c3f-8c0c-fd4acd6f0666" containerName="oc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040620 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="566d73b4-920e-430e-ab8c-da58c5834dce" containerName="extract"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.040632 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a41ed0b-fe0b-486e-844e-4f0cfa225bb8" containerName="barbican-db-sync"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.041343 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.043696 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-k4sfg"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.046090 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.046800 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-worker-config-data"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.062909 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.093869 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.094884 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.096263 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-keystone-listener-config-data"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.104912 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.134738 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data-custom\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.134805 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.134895 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdt88\" (UniqueName: \"kubernetes.io/projected/e2752865-3de3-46a0-b9d1-0ddb9422835b-kube-api-access-vdt88\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.134927 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2752865-3de3-46a0-b9d1-0ddb9422835b-logs\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.160857 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e736658b-7928-4d43-b26c-5c93e8fb5f99" path="/var/lib/kubelet/pods/e736658b-7928-4d43-b26c-5c93e8fb5f99/volumes"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236260 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdt88\" (UniqueName: \"kubernetes.io/projected/e2752865-3de3-46a0-b9d1-0ddb9422835b-kube-api-access-vdt88\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236316 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2752865-3de3-46a0-b9d1-0ddb9422835b-logs\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236371 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nk7r\" (UniqueName: \"kubernetes.io/projected/a3c571db-af37-4e76-a633-9f58b814341f-kube-api-access-2nk7r\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236413 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data-custom\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236440 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data-custom\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236465 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.236531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c571db-af37-4e76-a633-9f58b814341f-logs\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.237011 4971
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2752865-3de3-46a0-b9d1-0ddb9422835b-logs\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.240757 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data-custom\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.241241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2752865-3de3-46a0-b9d1-0ddb9422835b-config-data\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.259709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdt88\" (UniqueName: \"kubernetes.io/projected/e2752865-3de3-46a0-b9d1-0ddb9422835b-kube-api-access-vdt88\") pod \"barbican-worker-5dbbf7ff77-mmrwc\" (UID: \"e2752865-3de3-46a0-b9d1-0ddb9422835b\") " pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.304439 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.305629 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.307905 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-api-config-data"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.324154 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.349904 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.349962 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c571db-af37-4e76-a633-9f58b814341f-logs\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.350061 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nk7r\" (UniqueName: \"kubernetes.io/projected/a3c571db-af37-4e76-a633-9f58b814341f-kube-api-access-2nk7r\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.350182 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data-custom\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.350566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c571db-af37-4e76-a633-9f58b814341f-logs\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.357069 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data-custom\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.358368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c571db-af37-4e76-a633-9f58b814341f-config-data\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.381574 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nk7r\" (UniqueName: \"kubernetes.io/projected/a3c571db-af37-4e76-a633-9f58b814341f-kube-api-access-2nk7r\") pod \"barbican-keystone-listener-7c5f6b4756-n7vwl\" (UID: \"a3c571db-af37-4e76-a633-9f58b814341f\") " pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.408782 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.434562 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.451197 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da90eb8d-a57a-4d85-978b-919c2008cd3a-logs\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.451291 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data-custom\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.451368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.451404 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzdcd\" (UniqueName: \"kubernetes.io/projected/da90eb8d-a57a-4d85-978b-919c2008cd3a-kube-api-access-rzdcd\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.553477 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da90eb8d-a57a-4d85-978b-919c2008cd3a-logs\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.553566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data-custom\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.553604 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.553642 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzdcd\" (UniqueName: \"kubernetes.io/projected/da90eb8d-a57a-4d85-978b-919c2008cd3a-kube-api-access-rzdcd\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.555157 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da90eb8d-a57a-4d85-978b-919c2008cd3a-logs\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.559791 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.566339 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da90eb8d-a57a-4d85-978b-919c2008cd3a-config-data-custom\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.580610 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzdcd\" (UniqueName: \"kubernetes.io/projected/da90eb8d-a57a-4d85-978b-919c2008cd3a-kube-api-access-rzdcd\") pod \"barbican-api-9984f6cdd-9rzrp\" (UID: \"da90eb8d-a57a-4d85-978b-919c2008cd3a\") " pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.620877 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.649208 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc"]
Mar 09 09:38:07 crc kubenswrapper[4971]: I0309 09:38:07.842286 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc" event={"ID":"e2752865-3de3-46a0-b9d1-0ddb9422835b","Type":"ContainerStarted","Data":"e5cbb62c0c9093d093699fcbd6163ceef7ab7b77c8fc8d4f5c51d7bdc915a82e"}
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.011986 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl"]
Mar 09 09:38:08 crc kubenswrapper[4971]: W0309 09:38:08.012873 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c571db_af37_4e76_a633_9f58b814341f.slice/crio-d66bb7c00f138e73057dafacb21a4d8bcf454cc56740d205fc6429d40f2cd3e6 WatchSource:0}: Error finding container d66bb7c00f138e73057dafacb21a4d8bcf454cc56740d205fc6429d40f2cd3e6: Status 404 returned error can't find the container with id d66bb7c00f138e73057dafacb21a4d8bcf454cc56740d205fc6429d40f2cd3e6
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.076610 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"]
Mar 09 09:38:08 crc kubenswrapper[4971]: W0309 09:38:08.079842 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda90eb8d_a57a_4d85_978b_919c2008cd3a.slice/crio-202993eeae1122424ffd10daea38c58bc2fd6381c1aa24d6704eebeaab50467a WatchSource:0}: Error finding container 202993eeae1122424ffd10daea38c58bc2fd6381c1aa24d6704eebeaab50467a: Status 404 returned error can't find the container with id 202993eeae1122424ffd10daea38c58bc2fd6381c1aa24d6704eebeaab50467a
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.850044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" event={"ID":"da90eb8d-a57a-4d85-978b-919c2008cd3a","Type":"ContainerStarted","Data":"84a6f30d141df641e601c742b64bd3120be774823726652e4503bb08601d852f"}
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.850638 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" event={"ID":"da90eb8d-a57a-4d85-978b-919c2008cd3a","Type":"ContainerStarted","Data":"e5f4e754672b4afd78b564cd6aee2a0195018fdee67e0e7e2ffa6f73966fd0b0"}
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.850658 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.850669 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" event={"ID":"da90eb8d-a57a-4d85-978b-919c2008cd3a","Type":"ContainerStarted","Data":"202993eeae1122424ffd10daea38c58bc2fd6381c1aa24d6704eebeaab50467a"}
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.851205 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl" event={"ID":"a3c571db-af37-4e76-a633-9f58b814341f","Type":"ContainerStarted","Data":"d66bb7c00f138e73057dafacb21a4d8bcf454cc56740d205fc6429d40f2cd3e6"}
Mar 09 09:38:08 crc kubenswrapper[4971]: I0309 09:38:08.871977 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" podStartSLOduration=1.871957041 podStartE2EDuration="1.871957041s" podCreationTimestamp="2026-03-09 09:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:38:08.86953192 +0000 UTC m=+1092.429459730" watchObservedRunningTime="2026-03-09 09:38:08.871957041 +0000 UTC m=+1092.431884851"
Mar 09 09:38:09 crc kubenswrapper[4971]: I0309 09:38:09.858326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc" event={"ID":"e2752865-3de3-46a0-b9d1-0ddb9422835b","Type":"ContainerStarted","Data":"38750a7227b5f535c1db080bbb4b2ca628d02ad251edc063e1e9cb29201e4378"}
Mar 09 09:38:09 crc kubenswrapper[4971]: I0309 09:38:09.858627 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc" event={"ID":"e2752865-3de3-46a0-b9d1-0ddb9422835b","Type":"ContainerStarted","Data":"68b8758f7ca22eba046fae61fee7f97815370b4429d4bc982372bac7d50f33f9"}
Mar 09 09:38:09 crc kubenswrapper[4971]: I0309 09:38:09.861115 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl" event={"ID":"a3c571db-af37-4e76-a633-9f58b814341f","Type":"ContainerStarted","Data":"fa2b72ac85399b3edfb70469af23baeb020bedb9ae425a9c5f3f16411d46a84c"}
Mar 09 09:38:09 crc kubenswrapper[4971]: I0309 09:38:09.861279 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp"
Mar 09 09:38:09 crc kubenswrapper[4971]: I0309 09:38:09.876761 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-worker-5dbbf7ff77-mmrwc" podStartSLOduration=1.726891997 podStartE2EDuration="2.876744766s" podCreationTimestamp="2026-03-09 09:38:07 +0000 UTC" firstStartedPulling="2026-03-09 09:38:07.662913785 +0000 UTC m=+1091.222841595" lastFinishedPulling="2026-03-09 09:38:08.812766554 +0000 UTC m=+1092.372694364" observedRunningTime="2026-03-09 09:38:09.875127689 +0000 UTC m=+1093.435055499" watchObservedRunningTime="2026-03-09 09:38:09.876744766 +0000 UTC m=+1093.436672576"
Mar 09 09:38:10 crc
kubenswrapper[4971]: I0309 09:38:10.868011 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl" event={"ID":"a3c571db-af37-4e76-a633-9f58b814341f","Type":"ContainerStarted","Data":"88911ae4657de4f8b3cbffd97bc510eacfbdbdba3dc9d2a38eb7b2615784c43d"}
Mar 09 09:38:10 crc kubenswrapper[4971]: I0309 09:38:10.890224 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-keystone-listener-7c5f6b4756-n7vwl" podStartSLOduration=2.359157715 podStartE2EDuration="3.890210365s" podCreationTimestamp="2026-03-09 09:38:07 +0000 UTC" firstStartedPulling="2026-03-09 09:38:08.015208404 +0000 UTC m=+1091.575136214" lastFinishedPulling="2026-03-09 09:38:09.546261054 +0000 UTC m=+1093.106188864" observedRunningTime="2026-03-09 09:38:10.887222028 +0000 UTC m=+1094.447149838" watchObservedRunningTime="2026-03-09 09:38:10.890210365 +0000 UTC m=+1094.450138165"
Mar 09 09:38:14 crc kubenswrapper[4971]: I0309 09:38:14.794833 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:38:14 crc kubenswrapper[4971]: I0309 09:38:14.795521 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.505859 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"]
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.507074 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.509830 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k4bl2"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.509830 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.522459 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"]
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.615052 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-apiservice-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.615101 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d86h\" (UniqueName: \"kubernetes.io/projected/802c0560-39aa-4e21-a55e-6374f50e4301-kube-api-access-4d86h\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.615191 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-webhook-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.716304 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-webhook-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.716674 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-apiservice-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.716696 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d86h\" (UniqueName: \"kubernetes.io/projected/802c0560-39aa-4e21-a55e-6374f50e4301-kube-api-access-4d86h\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.722714 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-webhook-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"
Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.730998 4971 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/802c0560-39aa-4e21-a55e-6374f50e4301-apiservice-cert\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.742845 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d86h\" (UniqueName: \"kubernetes.io/projected/802c0560-39aa-4e21-a55e-6374f50e4301-kube-api-access-4d86h\") pod \"swift-operator-controller-manager-6fd95cd797-b9sk8\" (UID: \"802c0560-39aa-4e21-a55e-6374f50e4301\") " pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" Mar 09 09:38:18 crc kubenswrapper[4971]: I0309 09:38:18.823853 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" Mar 09 09:38:19 crc kubenswrapper[4971]: I0309 09:38:19.178807 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" Mar 09 09:38:19 crc kubenswrapper[4971]: I0309 09:38:19.245827 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-9984f6cdd-9rzrp" Mar 09 09:38:19 crc kubenswrapper[4971]: I0309 09:38:19.338433 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8"] Mar 09 09:38:19 crc kubenswrapper[4971]: I0309 09:38:19.932606 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" event={"ID":"802c0560-39aa-4e21-a55e-6374f50e4301","Type":"ContainerStarted","Data":"73ca5ab54863bb7ed8eae75b0e5e854687317688ece0f528f3cd557521710756"} Mar 09 09:38:21 crc kubenswrapper[4971]: I0309 09:38:21.946243 4971 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" event={"ID":"802c0560-39aa-4e21-a55e-6374f50e4301","Type":"ContainerStarted","Data":"b7c2d738796d4525030e82997930843a40f55c5fa4d4b6cfd3d9c7e56aac06f0"} Mar 09 09:38:21 crc kubenswrapper[4971]: I0309 09:38:21.946941 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" Mar 09 09:38:21 crc kubenswrapper[4971]: I0309 09:38:21.968769 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" podStartSLOduration=2.151270968 podStartE2EDuration="3.968747852s" podCreationTimestamp="2026-03-09 09:38:18 +0000 UTC" firstStartedPulling="2026-03-09 09:38:19.351244658 +0000 UTC m=+1102.911172468" lastFinishedPulling="2026-03-09 09:38:21.168721542 +0000 UTC m=+1104.728649352" observedRunningTime="2026-03-09 09:38:21.965503129 +0000 UTC m=+1105.525430939" watchObservedRunningTime="2026-03-09 09:38:21.968747852 +0000 UTC m=+1105.528675662" Mar 09 09:38:28 crc kubenswrapper[4971]: I0309 09:38:28.837007 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fd95cd797-b9sk8" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.442229 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.447276 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.450447 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.450506 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-8g9mx" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.451288 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.452856 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.520824 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.610708 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.610776 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.610950 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") 
" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.611076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x56b\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.611149 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712248 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712324 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712388 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x56b\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712419 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712458 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: E0309 09:38:31.712513 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:38:31 crc kubenswrapper[4971]: E0309 09:38:31.712550 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:38:31 crc kubenswrapper[4971]: E0309 09:38:31.712624 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift podName:6f4feb95-a276-4089-9876-d30cde31f67c nodeName:}" failed. No retries permitted until 2026-03-09 09:38:32.212599408 +0000 UTC m=+1115.772527278 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift") pod "swift-storage-0" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c") : configmap "swift-ring-files" not found Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712868 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712912 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.712943 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") device mount path \"/mnt/openstack/pv10\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.744497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.744914 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x56b\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") 
" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.970799 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-5jh5l"] Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.982744 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.987983 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.988136 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.988244 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:38:31 crc kubenswrapper[4971]: I0309 09:38:31.992907 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-5jh5l"] Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117191 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117264 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117306 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117609 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzgnd\" (UniqueName: \"kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.117674 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc 
kubenswrapper[4971]: I0309 09:38:32.218829 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218946 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzgnd\" (UniqueName: \"kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.218987 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.219021 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.219047 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.219092 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift podName:6f4feb95-a276-4089-9876-d30cde31f67c nodeName:}" failed. No retries permitted until 2026-03-09 09:38:33.219077081 +0000 UTC m=+1116.779004891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift") pod "swift-storage-0" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c") : configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.219679 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.219762 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.220199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.222842 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.224398 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.242436 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzgnd\" (UniqueName: \"kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd\") pod \"swift-ring-rebalance-5jh5l\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") " pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.308970 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.395827 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"] Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.402283 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.444408 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"] Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.524146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.524210 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.524272 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.524332 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.524407 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6752k\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.626295 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.626387 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.626448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.626510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.626544 4971 projected.go:288] Couldn't get configMap 
swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.626569 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-htk6v: configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: E0309 09:38:32.626624 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift podName:315d491f-24ac-4eda-9e07-1e0533f2f9b7 nodeName:}" failed. No retries permitted until 2026-03-09 09:38:33.126604846 +0000 UTC m=+1116.686532656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift") pod "swift-proxy-76c998454c-htk6v" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7") : configmap "swift-ring-files" not found Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.626566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6752k\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.627682 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.627698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd\") pod 
\"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.635784 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.649112 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6752k\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.753043 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-5jh5l"]
Mar 09 09:38:32 crc kubenswrapper[4971]: W0309 09:38:32.759961 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c0f78c3_b59b_4ec0_9147_904583e571a1.slice/crio-fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826 WatchSource:0}: Error finding container fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826: Status 404 returned error can't find the container with id fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826
Mar 09 09:38:32 crc kubenswrapper[4971]: I0309 09:38:32.762252 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:38:33 crc kubenswrapper[4971]: I0309 09:38:33.023858 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" event={"ID":"0c0f78c3-b59b-4ec0-9147-904583e571a1","Type":"ContainerStarted","Data":"fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826"}
Mar 09 09:38:33 crc kubenswrapper[4971]: I0309 09:38:33.133247 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.133486 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.133512 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-htk6v: configmap "swift-ring-files" not found
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.133578 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift podName:315d491f-24ac-4eda-9e07-1e0533f2f9b7 nodeName:}" failed. No retries permitted until 2026-03-09 09:38:34.133559203 +0000 UTC m=+1117.693487023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift") pod "swift-proxy-76c998454c-htk6v" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7") : configmap "swift-ring-files" not found
Mar 09 09:38:33 crc kubenswrapper[4971]: I0309 09:38:33.234419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.234671 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.234708 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 09:38:33 crc kubenswrapper[4971]: E0309 09:38:33.234810 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift podName:6f4feb95-a276-4089-9876-d30cde31f67c nodeName:}" failed. No retries permitted until 2026-03-09 09:38:35.234788905 +0000 UTC m=+1118.794716715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift") pod "swift-storage-0" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c") : configmap "swift-ring-files" not found
Mar 09 09:38:34 crc kubenswrapper[4971]: I0309 09:38:34.147105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:34 crc kubenswrapper[4971]: E0309 09:38:34.147313 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:34 crc kubenswrapper[4971]: E0309 09:38:34.147337 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-htk6v: configmap "swift-ring-files" not found
Mar 09 09:38:34 crc kubenswrapper[4971]: E0309 09:38:34.147421 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift podName:315d491f-24ac-4eda-9e07-1e0533f2f9b7 nodeName:}" failed. No retries permitted until 2026-03-09 09:38:36.147402744 +0000 UTC m=+1119.707330554 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift") pod "swift-proxy-76c998454c-htk6v" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7") : configmap "swift-ring-files" not found
Mar 09 09:38:35 crc kubenswrapper[4971]: I0309 09:38:35.265266 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:35 crc kubenswrapper[4971]: E0309 09:38:35.265437 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:35 crc kubenswrapper[4971]: E0309 09:38:35.265956 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 09:38:35 crc kubenswrapper[4971]: E0309 09:38:35.266031 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift podName:6f4feb95-a276-4089-9876-d30cde31f67c nodeName:}" failed. No retries permitted until 2026-03-09 09:38:39.266007968 +0000 UTC m=+1122.825935778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift") pod "swift-storage-0" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c") : configmap "swift-ring-files" not found
Mar 09 09:38:36 crc kubenswrapper[4971]: I0309 09:38:36.189883 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:36 crc kubenswrapper[4971]: E0309 09:38:36.190105 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:36 crc kubenswrapper[4971]: E0309 09:38:36.190139 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-htk6v: configmap "swift-ring-files" not found
Mar 09 09:38:36 crc kubenswrapper[4971]: E0309 09:38:36.190212 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift podName:315d491f-24ac-4eda-9e07-1e0533f2f9b7 nodeName:}" failed. No retries permitted until 2026-03-09 09:38:40.190189649 +0000 UTC m=+1123.750117459 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift") pod "swift-proxy-76c998454c-htk6v" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7") : configmap "swift-ring-files" not found
Mar 09 09:38:38 crc kubenswrapper[4971]: I0309 09:38:38.061990 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" event={"ID":"0c0f78c3-b59b-4ec0-9147-904583e571a1","Type":"ContainerStarted","Data":"554187e3ab94ee132b60bde01c040ad442e61f588caa077ffab326909e95d74f"}
Mar 09 09:38:39 crc kubenswrapper[4971]: I0309 09:38:39.356160 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:39 crc kubenswrapper[4971]: E0309 09:38:39.356405 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:39 crc kubenswrapper[4971]: E0309 09:38:39.356450 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 09:38:39 crc kubenswrapper[4971]: E0309 09:38:39.356517 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift podName:6f4feb95-a276-4089-9876-d30cde31f67c nodeName:}" failed. No retries permitted until 2026-03-09 09:38:47.356497109 +0000 UTC m=+1130.916424919 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift") pod "swift-storage-0" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c") : configmap "swift-ring-files" not found
Mar 09 09:38:40 crc kubenswrapper[4971]: I0309 09:38:40.268195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:40 crc kubenswrapper[4971]: E0309 09:38:40.268383 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:38:40 crc kubenswrapper[4971]: E0309 09:38:40.268644 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-htk6v: configmap "swift-ring-files" not found
Mar 09 09:38:40 crc kubenswrapper[4971]: E0309 09:38:40.268701 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift podName:315d491f-24ac-4eda-9e07-1e0533f2f9b7 nodeName:}" failed. No retries permitted until 2026-03-09 09:38:48.268684255 +0000 UTC m=+1131.828612065 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift") pod "swift-proxy-76c998454c-htk6v" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7") : configmap "swift-ring-files" not found
Mar 09 09:38:44 crc kubenswrapper[4971]: I0309 09:38:44.124143 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c0f78c3-b59b-4ec0-9147-904583e571a1" containerID="554187e3ab94ee132b60bde01c040ad442e61f588caa077ffab326909e95d74f" exitCode=0
Mar 09 09:38:44 crc kubenswrapper[4971]: I0309 09:38:44.124465 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" event={"ID":"0c0f78c3-b59b-4ec0-9147-904583e571a1","Type":"ContainerDied","Data":"554187e3ab94ee132b60bde01c040ad442e61f588caa077ffab326909e95d74f"}
Mar 09 09:38:44 crc kubenswrapper[4971]: I0309 09:38:44.795308 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:38:44 crc kubenswrapper[4971]: I0309 09:38:44.795456 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.406573 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l"
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.546607 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.546912 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.546944 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.546991 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.547071 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzgnd\" (UniqueName: \"kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.547120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf\") pod \"0c0f78c3-b59b-4ec0-9147-904583e571a1\" (UID: \"0c0f78c3-b59b-4ec0-9147-904583e571a1\") "
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.547800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.548375 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.559258 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd" (OuterVolumeSpecName: "kube-api-access-nzgnd") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "kube-api-access-nzgnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.566695 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts" (OuterVolumeSpecName: "scripts") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.573024 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.587109 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0c0f78c3-b59b-4ec0-9147-904583e571a1" (UID: "0c0f78c3-b59b-4ec0-9147-904583e571a1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648487 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c0f78c3-b59b-4ec0-9147-904583e571a1-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648527 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzgnd\" (UniqueName: \"kubernetes.io/projected/0c0f78c3-b59b-4ec0-9147-904583e571a1-kube-api-access-nzgnd\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648546 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648563 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648578 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c0f78c3-b59b-4ec0-9147-904583e571a1-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:45 crc kubenswrapper[4971]: I0309 09:38:45.648590 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c0f78c3-b59b-4ec0-9147-904583e571a1-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:38:46 crc kubenswrapper[4971]: I0309 09:38:46.142653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l" event={"ID":"0c0f78c3-b59b-4ec0-9147-904583e571a1","Type":"ContainerDied","Data":"fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826"}
Mar 09 09:38:46 crc kubenswrapper[4971]: I0309 09:38:46.142709 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-5jh5l"
Mar 09 09:38:46 crc kubenswrapper[4971]: I0309 09:38:46.142716 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3938fe13723d27cbe2d795500fc43a8cadf8e9e0aa7b71f02e5e99be02c826"
Mar 09 09:38:46 crc kubenswrapper[4971]: I0309 09:38:46.313549 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:47 crc kubenswrapper[4971]: I0309 09:38:47.370893 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:47 crc kubenswrapper[4971]: I0309 09:38:47.380976 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"swift-storage-0\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:47 crc kubenswrapper[4971]: I0309 09:38:47.665655 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:38:47 crc kubenswrapper[4971]: I0309 09:38:47.884684 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.090495 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.156308 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"78632660fac2835d640401ec4ef5bbfb577bd0df04b45cf9d41ee5bf7d04e684"}
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.284402 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.293136 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"swift-proxy-76c998454c-htk6v\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.344214 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:48 crc kubenswrapper[4971]: I0309 09:38:48.747949 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"]
Mar 09 09:38:48 crc kubenswrapper[4971]: W0309 09:38:48.761800 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod315d491f_24ac_4eda_9e07_1e0533f2f9b7.slice/crio-a7588003a6294342238d38881207f7fbe0c789e9e9580fd23c28c1d1dab119a7 WatchSource:0}: Error finding container a7588003a6294342238d38881207f7fbe0c789e9e9580fd23c28c1d1dab119a7: Status 404 returned error can't find the container with id a7588003a6294342238d38881207f7fbe0c789e9e9580fd23c28c1d1dab119a7
Mar 09 09:38:49 crc kubenswrapper[4971]: I0309 09:38:49.163161 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerStarted","Data":"448228bb223b887e0933c5ff8a7cec55427aa455ff0273e66a0d2d7d1c127720"}
Mar 09 09:38:49 crc kubenswrapper[4971]: I0309 09:38:49.163557 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerStarted","Data":"a7588003a6294342238d38881207f7fbe0c789e9e9580fd23c28c1d1dab119a7"}
Mar 09 09:38:49 crc kubenswrapper[4971]: I0309 09:38:49.469958 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.171550 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerStarted","Data":"3ece388c0688fe0f8fb63a883a4cc8c07356cda3a3ae8f63dcc9935c32bd11b1"}
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.171687 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.174530 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17"}
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.174561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0"}
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.174573 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4"}
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.174585 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e"}
Mar 09 09:38:50 crc kubenswrapper[4971]: I0309 09:38:50.192829 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" podStartSLOduration=18.192810129 podStartE2EDuration="18.192810129s" podCreationTimestamp="2026-03-09 09:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:38:50.191158741 +0000 UTC m=+1133.751086561" watchObservedRunningTime="2026-03-09 09:38:50.192810129 +0000 UTC m=+1133.752737939"
Mar 09 09:38:51 crc kubenswrapper[4971]: I0309 09:38:51.003804 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:51 crc kubenswrapper[4971]: I0309 09:38:51.181067 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:52 crc kubenswrapper[4971]: I0309 09:38:52.222419 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13"}
Mar 09 09:38:52 crc kubenswrapper[4971]: I0309 09:38:52.222738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c"}
Mar 09 09:38:52 crc kubenswrapper[4971]: I0309 09:38:52.222754 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9"}
Mar 09 09:38:52 crc kubenswrapper[4971]: I0309 09:38:52.222762 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc"}
Mar 09 09:38:52 crc kubenswrapper[4971]: I0309 09:38:52.571123 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:53 crc kubenswrapper[4971]: I0309 09:38:53.246163 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"}
Mar 09 09:38:53 crc kubenswrapper[4971]: I0309 09:38:53.350828 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.160131 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.263010 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"}
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.263075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"}
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.263089 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"}
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.263100 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"}
Mar 09 09:38:54 crc kubenswrapper[4971]: I0309 09:38:54.263109 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"}
Mar 09 09:38:55 crc kubenswrapper[4971]: I0309 09:38:55.284297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerStarted","Data":"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"}
Mar 09 09:38:55 crc kubenswrapper[4971]: I0309 09:38:55.333942 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=20.390603831 podStartE2EDuration="25.333920025s" podCreationTimestamp="2026-03-09 09:38:30 +0000 UTC" firstStartedPulling="2026-03-09 09:38:48.096317644 +0000 UTC m=+1131.656245454" lastFinishedPulling="2026-03-09 09:38:53.039633848 +0000 UTC m=+1136.599561648" observedRunningTime="2026-03-09 09:38:55.325249306 +0000 UTC m=+1138.885177126" watchObservedRunningTime="2026-03-09 09:38:55.333920025 +0000 UTC m=+1138.893847835"
Mar 09 09:38:55 crc kubenswrapper[4971]: I0309 09:38:55.739559 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:57 crc kubenswrapper[4971]: I0309 09:38:57.302121 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:38:58 crc kubenswrapper[4971]: I0309 09:38:58.347119 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:38:58 crc kubenswrapper[4971]: I0309 09:38:58.822983 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:39:00 crc kubenswrapper[4971]: I0309 09:39:00.349442 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-5jh5l_0c0f78c3-b59b-4ec0-9147-904583e571a1/swift-ring-rebalance/0.log"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.734482 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 09:39:01 crc kubenswrapper[4971]: E0309 09:39:01.734825 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0f78c3-b59b-4ec0-9147-904583e571a1" containerName="swift-ring-rebalance"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.734838 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0f78c3-b59b-4ec0-9147-904583e571a1" containerName="swift-ring-rebalance"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.734994 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0f78c3-b59b-4ec0-9147-904583e571a1" containerName="swift-ring-rebalance"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.738997 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.739772 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.750281 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.750451 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.775240 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910543 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910573 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910649 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlws9\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910697 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock\")
pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910827 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66dp\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.910941 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:01 crc kubenswrapper[4971]: I0309 09:39:01.911053 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " 
pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.011995 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012054 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012109 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012134 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66dp\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012158 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012233 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012277 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012317 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlws9\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012574 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 
09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012654 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012701 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012737 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") device mount path \"/mnt/openstack/pv04\"" pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012766 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.012855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.020008 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.022940 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.029046 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlws9\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.033575 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.034031 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66dp\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp\") pod \"swift-storage-2\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.036317 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-1\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc 
kubenswrapper[4971]: I0309 09:39:02.064738 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.080571 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.533658 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 09:39:02 crc kubenswrapper[4971]: W0309 09:39:02.549182 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c89471_afd6_4cce_8a00_54dbcd4ef92b.slice/crio-61e7bf904994a0c837f0fd4cacda734b9a925d1138328259709a053f3b92c5b4 WatchSource:0}: Error finding container 61e7bf904994a0c837f0fd4cacda734b9a925d1138328259709a053f3b92c5b4: Status 404 returned error can't find the container with id 61e7bf904994a0c837f0fd4cacda734b9a925d1138328259709a053f3b92c5b4 Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.605197 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.814639 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-5jh5l"] Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.826002 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-5jh5l"] Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.832906 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-x7xdk"] Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.834032 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.839243 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.847769 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-x7xdk"] Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.848588 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934695 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934740 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934784 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnqjs\" (UniqueName: \"kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934830 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934861 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:02 crc kubenswrapper[4971]: I0309 09:39:02.934967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037136 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037206 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037236 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.037278 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnqjs\" (UniqueName: \"kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.038455 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.038507 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.039175 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.046789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.057155 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.060980 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnqjs\" (UniqueName: \"kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs\") pod \"swift-ring-rebalance-x7xdk\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") " pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.186272 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.223545 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0f78c3-b59b-4ec0-9147-904583e571a1" path="/var/lib/kubelet/pods/0c0f78c3-b59b-4ec0-9147-904583e571a1/volumes" Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.373672 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.373715 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.373729 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.373742 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"61e7bf904994a0c837f0fd4cacda734b9a925d1138328259709a053f3b92c5b4"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.385059 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.385130 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.385141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.385151 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.385160 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"fe2d3e8aaddf3d2086091ba540a1cc69d288f6c02e38f8e76279f891e769b629"} Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.627029 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-x7xdk"] Mar 09 09:39:03 crc kubenswrapper[4971]: I0309 09:39:03.798524 4971 scope.go:117] "RemoveContainer" containerID="6f6ee7820a4785d9490723be5f5bafdf23d43e451ddd0d6e0573798c47b11cd9" Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.394934 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.395404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.395424 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.395441 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.397256 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" event={"ID":"5fcb9db8-86c8-4d4b-b541-5cf291d702f2","Type":"ContainerStarted","Data":"e2c0fb322aacd49c39694152ff8bffb836dccf9b61150ee892504b3c52bfe072"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.397304 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" event={"ID":"5fcb9db8-86c8-4d4b-b541-5cf291d702f2","Type":"ContainerStarted","Data":"3923b9d0ce8b645e0ff8942009c17cd8d9218cb9dec368026709050216620a28"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.401180 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.401222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.401237 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.401248 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f"} Mar 09 09:39:04 crc kubenswrapper[4971]: I0309 09:39:04.448114 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" podStartSLOduration=2.448093553 podStartE2EDuration="2.448093553s" podCreationTimestamp="2026-03-09 09:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:39:04.444991444 +0000 UTC m=+1148.004919254" watchObservedRunningTime="2026-03-09 09:39:04.448093553 +0000 UTC m=+1148.008021363" Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.447627 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.447982 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3"} Mar 09 09:39:05 crc 
kubenswrapper[4971]: I0309 09:39:05.447997 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.448005 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.477320 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.477383 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.477398 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af"} Mar 09 09:39:05 crc kubenswrapper[4971]: I0309 09:39:05.477408 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.490519 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.490828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.490869 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerStarted","Data":"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.496478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.496527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.496542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda"} Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.496555 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerStarted","Data":"010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb"}
Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.528489 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=6.528465677 podStartE2EDuration="6.528465677s" podCreationTimestamp="2026-03-09 09:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:39:06.525128371 +0000 UTC m=+1150.085056211" watchObservedRunningTime="2026-03-09 09:39:06.528465677 +0000 UTC m=+1150.088393487"
Mar 09 09:39:06 crc kubenswrapper[4971]: I0309 09:39:06.562253 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=6.562237265 podStartE2EDuration="6.562237265s" podCreationTimestamp="2026-03-09 09:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:39:06.55823842 +0000 UTC m=+1150.118166230" watchObservedRunningTime="2026-03-09 09:39:06.562237265 +0000 UTC m=+1150.122165075"
Mar 09 09:39:12 crc kubenswrapper[4971]: I0309 09:39:12.561168 4971 generic.go:334] "Generic (PLEG): container finished" podID="5fcb9db8-86c8-4d4b-b541-5cf291d702f2" containerID="e2c0fb322aacd49c39694152ff8bffb836dccf9b61150ee892504b3c52bfe072" exitCode=0
Mar 09 09:39:12 crc kubenswrapper[4971]: I0309 09:39:12.561259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" event={"ID":"5fcb9db8-86c8-4d4b-b541-5cf291d702f2","Type":"ContainerDied","Data":"e2c0fb322aacd49c39694152ff8bffb836dccf9b61150ee892504b3c52bfe072"}
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.869047 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk"
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.926738 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnqjs\" (UniqueName: \"kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.926779 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.926812 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.926944 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.926984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.927022 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\"
(UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices\") pod \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\" (UID: \"5fcb9db8-86c8-4d4b-b541-5cf291d702f2\") "
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.927543 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.927584 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.935566 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs" (OuterVolumeSpecName: "kube-api-access-fnqjs") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "kube-api-access-fnqjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.945083 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts" (OuterVolumeSpecName: "scripts") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.947399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:39:13 crc kubenswrapper[4971]: I0309 09:39:13.947546 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5fcb9db8-86c8-4d4b-b541-5cf291d702f2" (UID: "5fcb9db8-86c8-4d4b-b541-5cf291d702f2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.029241 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.029281 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.029294 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnqjs\" (UniqueName: \"kubernetes.io/projected/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-kube-api-access-fnqjs\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.029304 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc
kubenswrapper[4971]: I0309 09:39:14.029312 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.029320 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5fcb9db8-86c8-4d4b-b541-5cf291d702f2-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.577127 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk" event={"ID":"5fcb9db8-86c8-4d4b-b541-5cf291d702f2","Type":"ContainerDied","Data":"3923b9d0ce8b645e0ff8942009c17cd8d9218cb9dec368026709050216620a28"}
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.577462 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3923b9d0ce8b645e0ff8942009c17cd8d9218cb9dec368026709050216620a28"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.577189 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-x7xdk"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.794854 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.794931 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.794993 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.795829 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.795930 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1" gracePeriod=600
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.823793 4971 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"]
Mar 09 09:39:14 crc kubenswrapper[4971]: E0309 09:39:14.824107 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fcb9db8-86c8-4d4b-b541-5cf291d702f2" containerName="swift-ring-rebalance"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.824124 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fcb9db8-86c8-4d4b-b541-5cf291d702f2" containerName="swift-ring-rebalance"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.824258 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fcb9db8-86c8-4d4b-b541-5cf291d702f2" containerName="swift-ring-rebalance"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.824752 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.832762 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"]
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.832940 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.833627 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.942980 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wb6p\" (UniqueName: \"kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.943077 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.943103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.943172 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.943228 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:14 crc kubenswrapper[4971]: I0309 09:39:14.943256 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.045148 4971
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.045274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.045387 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.045414 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wb6p\" (UniqueName: \"kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.046670 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.046994
4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.047076 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.047115 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.047167 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.053843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.056294 4971 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.069577 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wb6p\" (UniqueName: \"kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p\") pod \"swift-ring-rebalance-debug-bj58t\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.149111 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.386040 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"]
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.585029 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t" event={"ID":"8172c8d2-4555-4a38-a018-07349ddea4db","Type":"ContainerStarted","Data":"9d78bfe5d79bc1332f000be05a02ae3363e37f66af27305f24d9aaf4de48aeda"}
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.585422 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t" event={"ID":"8172c8d2-4555-4a38-a018-07349ddea4db","Type":"ContainerStarted","Data":"9ee313fed20c290f66e4bc4eb6b3ef4046bab3d7e4944d79767003464b41a0a1"}
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.587226 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1" exitCode=0
Mar 09
09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.587298 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1"}
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.587474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c"}
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.587603 4971 scope.go:117] "RemoveContainer" containerID="3faafb59e33c928765c2ecf23a7678ad846a40e6f9948d8c13dc3d6b7074865f"
Mar 09 09:39:15 crc kubenswrapper[4971]: I0309 09:39:15.601193 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t" podStartSLOduration=1.601174446 podStartE2EDuration="1.601174446s" podCreationTimestamp="2026-03-09 09:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:39:15.600513498 +0000 UTC m=+1159.160441308" watchObservedRunningTime="2026-03-09 09:39:15.601174446 +0000 UTC m=+1159.161102266"
Mar 09 09:39:16 crc kubenswrapper[4971]: I0309 09:39:16.598398 4971 generic.go:334] "Generic (PLEG): container finished" podID="8172c8d2-4555-4a38-a018-07349ddea4db" containerID="9d78bfe5d79bc1332f000be05a02ae3363e37f66af27305f24d9aaf4de48aeda" exitCode=0
Mar 09 09:39:16 crc kubenswrapper[4971]: I0309 09:39:16.598581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
event={"ID":"8172c8d2-4555-4a38-a018-07349ddea4db","Type":"ContainerDied","Data":"9d78bfe5d79bc1332f000be05a02ae3363e37f66af27305f24d9aaf4de48aeda"}
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.873310 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.908736 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"]
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.916081 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"]
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996226 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices\") pod \"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996406 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts\") pod \"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift\") pod \"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996581 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf\") pod
\"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996616 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wb6p\" (UniqueName: \"kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p\") pod \"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.996652 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf\") pod \"8172c8d2-4555-4a38-a018-07349ddea4db\" (UID: \"8172c8d2-4555-4a38-a018-07349ddea4db\") "
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.997103 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:17.997318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "etc-swift".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.003632 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p" (OuterVolumeSpecName: "kube-api-access-8wb6p") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "kube-api-access-8wb6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.020198 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts" (OuterVolumeSpecName: "scripts") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.020718 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.024658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8172c8d2-4555-4a38-a018-07349ddea4db" (UID: "8172c8d2-4555-4a38-a018-07349ddea4db"). InnerVolumeSpecName "dispersionconf".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098223 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8172c8d2-4555-4a38-a018-07349ddea4db-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098263 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098274 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098286 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wb6p\" (UniqueName: \"kubernetes.io/projected/8172c8d2-4555-4a38-a018-07349ddea4db-kube-api-access-8wb6p\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098301 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8172c8d2-4555-4a38-a018-07349ddea4db-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.098312 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8172c8d2-4555-4a38-a018-07349ddea4db-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.614472 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ee313fed20c290f66e4bc4eb6b3ef4046bab3d7e4944d79767003464b41a0a1"
Mar 09 09:39:18 crc kubenswrapper[4971]: I0309 09:39:18.614600 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj58t"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.162506 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8172c8d2-4555-4a38-a018-07349ddea4db" path="/var/lib/kubelet/pods/8172c8d2-4555-4a38-a018-07349ddea4db/volumes"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.282596 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"]
Mar 09 09:39:19 crc kubenswrapper[4971]: E0309 09:39:19.282905 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8172c8d2-4555-4a38-a018-07349ddea4db" containerName="swift-ring-rebalance"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.282917 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8172c8d2-4555-4a38-a018-07349ddea4db" containerName="swift-ring-rebalance"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.283060 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8172c8d2-4555-4a38-a018-07349ddea4db" containerName="swift-ring-rebalance"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.283518 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.286218 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.293775 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.294889 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"]
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416102 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416227 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"
Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416259 4971 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-558c9\" (UniqueName: \"kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.416424 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518048 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518138 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc 
kubenswrapper[4971]: I0309 09:39:19.518173 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518207 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518262 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-558c9\" (UniqueName: \"kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.518765 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc 
kubenswrapper[4971]: I0309 09:39:19.519084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.519185 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.523209 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.523456 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: I0309 09:39:19.537205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-558c9\" (UniqueName: \"kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9\") pod \"swift-ring-rebalance-debug-k7gpt\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:19 crc kubenswrapper[4971]: 
I0309 09:39:19.606139 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.048647 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.637552 4971 generic.go:334] "Generic (PLEG): container finished" podID="52cd61bd-9afd-40de-8369-c704972e7314" containerID="a930d6c87add9d624579cded3ce6a4a8d78f3381664435635a22f7c01ad8f49b" exitCode=0 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.637843 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" event={"ID":"52cd61bd-9afd-40de-8369-c704972e7314","Type":"ContainerDied","Data":"a930d6c87add9d624579cded3ce6a4a8d78f3381664435635a22f7c01ad8f49b"} Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.637870 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" event={"ID":"52cd61bd-9afd-40de-8369-c704972e7314","Type":"ContainerStarted","Data":"e96960eea13331917eed169a7a5a9fdca746d1b0855e444b664fa15ee24f8817"} Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.671120 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.677466 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.777769 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-x7xdk"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787120 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787758 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-server" containerID="cri-o://835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787814 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-server" containerID="cri-o://9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787836 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-expirer" containerID="cri-o://010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787884 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-updater" containerID="cri-o://685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787905 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-replicator" containerID="cri-o://d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787931 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-auditor" 
containerID="cri-o://50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787908 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-auditor" containerID="cri-o://b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788022 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-reaper" containerID="cri-o://d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788044 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-auditor" containerID="cri-o://778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788030 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="swift-recon-cron" containerID="cri-o://4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.787981 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-replicator" containerID="cri-o://ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788097 4971 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-updater" containerID="cri-o://8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788149 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="rsync" containerID="cri-o://2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788173 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-replicator" containerID="cri-o://371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.788228 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-server" containerID="cri-o://d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.812413 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-x7xdk"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.830478 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.830971 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-server" containerID="cri-o://848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e" 
gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831330 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="swift-recon-cron" containerID="cri-o://c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831417 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-auditor" containerID="cri-o://5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831422 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-updater" containerID="cri-o://121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831459 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-server" containerID="cri-o://1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831494 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-reaper" containerID="cri-o://192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831525 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" 
podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-auditor" containerID="cri-o://c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831449 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-replicator" containerID="cri-o://a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831573 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-replicator" containerID="cri-o://cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831619 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="rsync" containerID="cri-o://47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831681 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-expirer" containerID="cri-o://c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831709 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-auditor" containerID="cri-o://a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3" gracePeriod=30 Mar 09 09:39:20 crc 
kubenswrapper[4971]: I0309 09:39:20.831731 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-replicator" containerID="cri-o://15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831744 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-updater" containerID="cri-o://81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.831770 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-server" containerID="cri-o://6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.842863 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843359 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-server" containerID="cri-o://0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843547 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-server" containerID="cri-o://a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843900 4971 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-updater" containerID="cri-o://6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843916 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-auditor" containerID="cri-o://dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843931 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-replicator" containerID="cri-o://d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843929 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-expirer" containerID="cri-o://028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843948 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="swift-recon-cron" containerID="cri-o://9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843964 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="rsync" 
containerID="cri-o://4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843980 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-auditor" containerID="cri-o://b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843963 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-server" containerID="cri-o://b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.843993 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-updater" containerID="cri-o://2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.844005 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-replicator" containerID="cri-o://0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.844008 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-auditor" containerID="cri-o://046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.844021 4971 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-replicator" containerID="cri-o://63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.844024 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-reaper" containerID="cri-o://b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.967821 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"] Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.968361 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-httpd" containerID="cri-o://448228bb223b887e0933c5ff8a7cec55427aa455ff0273e66a0d2d7d1c127720" gracePeriod=30 Mar 09 09:39:20 crc kubenswrapper[4971]: I0309 09:39:20.968505 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-server" containerID="cri-o://3ece388c0688fe0f8fb63a883a4cc8c07356cda3a3ae8f63dcc9935c32bd11b1" gracePeriod=30 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.166453 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fcb9db8-86c8-4d4b-b541-5cf291d702f2" path="/var/lib/kubelet/pods/5fcb9db8-86c8-4d4b-b541-5cf291d702f2/volumes" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670228 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" 
containerID="4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670269 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670278 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670287 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670295 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670305 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670312 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670320 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670328 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670336 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670360 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670367 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670373 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670379 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670418 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670460 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670485 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670494 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d"} Mar 09 
09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670501 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670517 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670526 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670534 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.670543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679262 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679300 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679309 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679316 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679323 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679330 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679338 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679359 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679366 4971 
generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679373 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679380 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679387 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679394 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679401 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679442 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679486 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679496 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679505 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679514 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679522 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec"} Mar 09 
09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679531 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679539 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679547 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679556 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.679572 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.681814 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerID="3ece388c0688fe0f8fb63a883a4cc8c07356cda3a3ae8f63dcc9935c32bd11b1" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.681855 4971 generic.go:334] "Generic (PLEG): container finished" podID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerID="448228bb223b887e0933c5ff8a7cec55427aa455ff0273e66a0d2d7d1c127720" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.681950 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerDied","Data":"3ece388c0688fe0f8fb63a883a4cc8c07356cda3a3ae8f63dcc9935c32bd11b1"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.681981 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerDied","Data":"448228bb223b887e0933c5ff8a7cec55427aa455ff0273e66a0d2d7d1c127720"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.688996 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689030 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689040 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689048 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" 
containerID="a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689057 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689065 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689073 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689089 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689098 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689106 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689115 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689123 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689131 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689139 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e" exitCode=0 Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689108 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689205 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689218 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689235 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689266 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689278 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689290 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689313 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689325 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689336 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.689422 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e"} Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.817500 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.953751 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.966870 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6752k\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k\") pod \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.966985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd\") pod \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.967005 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") pod \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.967077 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd\") pod \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.967097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data\") pod \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\" (UID: \"315d491f-24ac-4eda-9e07-1e0533f2f9b7\") " Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.967614 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "315d491f-24ac-4eda-9e07-1e0533f2f9b7" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.968450 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "315d491f-24ac-4eda-9e07-1e0533f2f9b7" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.972598 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "315d491f-24ac-4eda-9e07-1e0533f2f9b7" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:21 crc kubenswrapper[4971]: I0309 09:39:21.972746 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k" (OuterVolumeSpecName: "kube-api-access-6752k") pod "315d491f-24ac-4eda-9e07-1e0533f2f9b7" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7"). InnerVolumeSpecName "kube-api-access-6752k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.006462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data" (OuterVolumeSpecName: "config-data") pod "315d491f-24ac-4eda-9e07-1e0533f2f9b7" (UID: "315d491f-24ac-4eda-9e07-1e0533f2f9b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068275 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-558c9\" (UniqueName: \"kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068475 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068516 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068575 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.068631 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices\") pod \"52cd61bd-9afd-40de-8369-c704972e7314\" (UID: \"52cd61bd-9afd-40de-8369-c704972e7314\") " Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069000 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069031 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069046 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/315d491f-24ac-4eda-9e07-1e0533f2f9b7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069058 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315d491f-24ac-4eda-9e07-1e0533f2f9b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069072 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6752k\" (UniqueName: \"kubernetes.io/projected/315d491f-24ac-4eda-9e07-1e0533f2f9b7-kube-api-access-6752k\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.069987 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.071090 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.073315 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9" (OuterVolumeSpecName: "kube-api-access-558c9") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "kube-api-access-558c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.085664 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.087869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.088729 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts" (OuterVolumeSpecName: "scripts") pod "52cd61bd-9afd-40de-8369-c704972e7314" (UID: "52cd61bd-9afd-40de-8369-c704972e7314"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170487 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170530 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-558c9\" (UniqueName: \"kubernetes.io/projected/52cd61bd-9afd-40de-8369-c704972e7314-kube-api-access-558c9\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170545 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170556 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/52cd61bd-9afd-40de-8369-c704972e7314-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170568 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/52cd61bd-9afd-40de-8369-c704972e7314-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.170577 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/52cd61bd-9afd-40de-8369-c704972e7314-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.702633 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v" event={"ID":"315d491f-24ac-4eda-9e07-1e0533f2f9b7","Type":"ContainerDied","Data":"a7588003a6294342238d38881207f7fbe0c789e9e9580fd23c28c1d1dab119a7"}
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.702799 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-htk6v"
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.702811 4971 scope.go:117] "RemoveContainer" containerID="3ece388c0688fe0f8fb63a883a4cc8c07356cda3a3ae8f63dcc9935c32bd11b1"
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.709457 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k7gpt"
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.729795 4971 scope.go:117] "RemoveContainer" containerID="448228bb223b887e0933c5ff8a7cec55427aa455ff0273e66a0d2d7d1c127720"
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.740134 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"]
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.747107 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-htk6v"]
Mar 09 09:39:22 crc kubenswrapper[4971]: I0309 09:39:22.752135 4971 scope.go:117] "RemoveContainer" containerID="a930d6c87add9d624579cded3ce6a4a8d78f3381664435635a22f7c01ad8f49b"
Mar 09 09:39:23 crc kubenswrapper[4971]: I0309 09:39:23.161142 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" path="/var/lib/kubelet/pods/315d491f-24ac-4eda-9e07-1e0533f2f9b7/volumes"
Mar 09 09:39:23 crc kubenswrapper[4971]: I0309 09:39:23.162217 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cd61bd-9afd-40de-8369-c704972e7314" path="/var/lib/kubelet/pods/52cd61bd-9afd-40de-8369-c704972e7314/volumes"
Mar 09 09:39:50 crc kubenswrapper[4971]: I0309 09:39:50.939799 4971 generic.go:334] "Generic (PLEG): container finished" podID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerID="4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb" exitCode=137
Mar 09 09:39:50 crc kubenswrapper[4971]: I0309 09:39:50.939863 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.115262 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.187729 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlws9\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9\") pod \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.187814 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache\") pod \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.187844 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift\") pod \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.187898 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock\") pod \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.187917 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\" (UID: \"52c89471-afd6-4cce-8a00-54dbcd4ef92b\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.188718 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock" (OuterVolumeSpecName: "lock") pod "52c89471-afd6-4cce-8a00-54dbcd4ef92b" (UID: "52c89471-afd6-4cce-8a00-54dbcd4ef92b"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.188788 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache" (OuterVolumeSpecName: "cache") pod "52c89471-afd6-4cce-8a00-54dbcd4ef92b" (UID: "52c89471-afd6-4cce-8a00-54dbcd4ef92b"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.194516 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "52c89471-afd6-4cce-8a00-54dbcd4ef92b" (UID: "52c89471-afd6-4cce-8a00-54dbcd4ef92b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.194534 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "52c89471-afd6-4cce-8a00-54dbcd4ef92b" (UID: "52c89471-afd6-4cce-8a00-54dbcd4ef92b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.194724 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9" (OuterVolumeSpecName: "kube-api-access-mlws9") pod "52c89471-afd6-4cce-8a00-54dbcd4ef92b" (UID: "52c89471-afd6-4cce-8a00-54dbcd4ef92b"). InnerVolumeSpecName "kube-api-access-mlws9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.235551 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.274107 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock\") pod \"6f4feb95-a276-4089-9876-d30cde31f67c\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289188 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x56b\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b\") pod \"6f4feb95-a276-4089-9876-d30cde31f67c\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289250 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") pod \"6f4feb95-a276-4089-9876-d30cde31f67c\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289333 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6f4feb95-a276-4089-9876-d30cde31f67c\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289402 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache\") pod \"6f4feb95-a276-4089-9876-d30cde31f67c\" (UID: \"6f4feb95-a276-4089-9876-d30cde31f67c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.289946 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock" (OuterVolumeSpecName: "lock") pod "6f4feb95-a276-4089-9876-d30cde31f67c" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.290548 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache" (OuterVolumeSpecName: "cache") pod "6f4feb95-a276-4089-9876-d30cde31f67c" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292098 4971 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-cache\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292121 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292132 4971 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/52c89471-afd6-4cce-8a00-54dbcd4ef92b-lock\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292154 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292165 4971 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-cache\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292176 4971 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f4feb95-a276-4089-9876-d30cde31f67c-lock\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.292187 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlws9\" (UniqueName: \"kubernetes.io/projected/52c89471-afd6-4cce-8a00-54dbcd4ef92b-kube-api-access-mlws9\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.293620 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b" (OuterVolumeSpecName: "kube-api-access-5x56b") pod "6f4feb95-a276-4089-9876-d30cde31f67c" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c"). InnerVolumeSpecName "kube-api-access-5x56b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.293735 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "6f4feb95-a276-4089-9876-d30cde31f67c" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.296988 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6f4feb95-a276-4089-9876-d30cde31f67c" (UID: "6f4feb95-a276-4089-9876-d30cde31f67c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.306692 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.392991 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f302bdd8-8044-48a4-aacd-13967f94570c\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.393430 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock\") pod \"f302bdd8-8044-48a4-aacd-13967f94570c\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.393463 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift\") pod \"f302bdd8-8044-48a4-aacd-13967f94570c\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.393584 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h66dp\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp\") pod \"f302bdd8-8044-48a4-aacd-13967f94570c\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.393611 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache\") pod \"f302bdd8-8044-48a4-aacd-13967f94570c\" (UID: \"f302bdd8-8044-48a4-aacd-13967f94570c\") "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.393964 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock" (OuterVolumeSpecName: "lock") pod "f302bdd8-8044-48a4-aacd-13967f94570c" (UID: "f302bdd8-8044-48a4-aacd-13967f94570c"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.394033 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache" (OuterVolumeSpecName: "cache") pod "f302bdd8-8044-48a4-aacd-13967f94570c" (UID: "f302bdd8-8044-48a4-aacd-13967f94570c"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395597 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395625 4971 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-lock\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395638 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x56b\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-kube-api-access-5x56b\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395651 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4feb95-a276-4089-9876-d30cde31f67c-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395663 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.395673 4971 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f302bdd8-8044-48a4-aacd-13967f94570c-cache\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.396010 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "f302bdd8-8044-48a4-aacd-13967f94570c" (UID: "f302bdd8-8044-48a4-aacd-13967f94570c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.396498 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp" (OuterVolumeSpecName: "kube-api-access-h66dp") pod "f302bdd8-8044-48a4-aacd-13967f94570c" (UID: "f302bdd8-8044-48a4-aacd-13967f94570c"). InnerVolumeSpecName "kube-api-access-h66dp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.396770 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f302bdd8-8044-48a4-aacd-13967f94570c" (UID: "f302bdd8-8044-48a4-aacd-13967f94570c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.409266 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.497513 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h66dp\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-kube-api-access-h66dp\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.497561 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.497615 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.497630 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f302bdd8-8044-48a4-aacd-13967f94570c-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.512330 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.599033 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.953980 4971 generic.go:334] "Generic (PLEG): container finished" podID="6f4feb95-a276-4089-9876-d30cde31f67c" containerID="c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf" exitCode=137
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.954137 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.954174 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"6f4feb95-a276-4089-9876-d30cde31f67c","Type":"ContainerDied","Data":"78632660fac2835d640401ec4ef5bbfb577bd0df04b45cf9d41ee5bf7d04e684"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.954225 4971 scope.go:117] "RemoveContainer" containerID="c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.954378 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962044 4971 generic.go:334] "Generic (PLEG): container finished" podID="f302bdd8-8044-48a4-aacd-13967f94570c" containerID="9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9" exitCode=137
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962100 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962124 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"f302bdd8-8044-48a4-aacd-13967f94570c","Type":"ContainerDied","Data":"fe2d3e8aaddf3d2086091ba540a1cc69d288f6c02e38f8e76279f891e769b629"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962135 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962145 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962150 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962156 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962161 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962166 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962171 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962176 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962181 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.962283 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969648 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"52c89471-afd6-4cce-8a00-54dbcd4ef92b","Type":"ContainerDied","Data":"61e7bf904994a0c837f0fd4cacda734b9a925d1138328259709a053f3b92c5b4"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969697 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969710 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969719 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969728 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969737 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969744 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969752 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969760 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969767 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969775 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969782 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969789 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969797 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969804 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969812 4971 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969"}
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.969822 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.974796 4971 scope.go:117] "RemoveContainer" containerID="47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"
Mar 09 09:39:51 crc kubenswrapper[4971]: I0309 09:39:51.995327 4971 scope.go:117] "RemoveContainer" containerID="c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.017104 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.027167 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.033123 4971 scope.go:117] "RemoveContainer" containerID="81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.040808 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.048181 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.057768 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.061629 4971 scope.go:117] "RemoveContainer" containerID="a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.064920 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.079201 4971 scope.go:117] "RemoveContainer" containerID="15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.122416 4971 scope.go:117] "RemoveContainer" containerID="6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.147421 4971 scope.go:117] "RemoveContainer" containerID="121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.165648 4971 scope.go:117] "RemoveContainer" containerID="5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.183200 4971 scope.go:117] "RemoveContainer" containerID="a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.201161 4971 scope.go:117] "RemoveContainer" containerID="1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.224297 4971 scope.go:117] "RemoveContainer" containerID="192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.243964 4971 scope.go:117] "RemoveContainer" containerID="c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.261927 4971 scope.go:117] "RemoveContainer" containerID="cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.276886 4971 scope.go:117] "RemoveContainer" containerID="848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.291680 4971 scope.go:117] "RemoveContainer" containerID="c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.292145 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf\": container with ID starting with c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf not found: ID does not exist" containerID="c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.292225 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf"} err="failed to get container status \"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf\": rpc error: code = NotFound desc = could not find container \"c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf\": container with ID starting with c4aa88515fa1151c995dced60a9702439c048af43e5987392d2bc961e486b0cf not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.292255 4971 scope.go:117] "RemoveContainer" containerID="47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.292664 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c\": container with ID starting with 47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c not found: ID does not exist" containerID="47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.292684 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c"} err="failed to get container status \"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c\": rpc error: code = NotFound desc = could not find container \"47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c\": container with ID starting with 47b0f5e147cc9ee4d9663829ece46b7c6b53841d0cb3b27cf5502732e5f43b8c not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.292696 4971 scope.go:117] "RemoveContainer" containerID="c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.292971 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612\": container with ID starting with c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612 not found: ID does not exist" containerID="c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.292991 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612"} err="failed to get container status \"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612\": rpc error: code = NotFound desc = could not find container \"c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612\": container with ID starting with c7f80e6d39240a4cdc9de835422fcb08a0a16fe50edd76e83c861182a184e612 not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.293003 4971 scope.go:117] "RemoveContainer" containerID="81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.293429 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7\": container with ID starting with 81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7 not found: ID does not exist" containerID="81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.293460 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7"} err="failed to get container status \"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7\": rpc error: code = NotFound desc = could not find container \"81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7\": container with ID starting with 81bd53effc126ccb7c3e3a23efcd194275c8f688c0d68922e1496208248e5ea7 not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.293473 4971 scope.go:117] "RemoveContainer" containerID="a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.293938 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3\": container with ID starting with a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3 not found: ID does not exist" containerID="a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.293957 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3"} err="failed to get container status \"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3\": rpc error: code = NotFound desc = could not find container \"a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3\": container with ID starting with a7508e7c1c0fc417c2ba35abcd0621e333541392996e38c35c97fcefd12b89d3 not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.293969 4971 scope.go:117] "RemoveContainer" containerID="15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.294223 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648\": container with ID starting with 15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648 not found: ID does not exist" containerID="15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.294255 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648"} err="failed to get container status \"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648\": rpc error: code = NotFound desc = could not find container \"15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648\": container with ID starting with 15187b478914a74d87583729ecb3e4c017f73934ddb3f6f0c883f4ecbb824648 not found: ID does not exist"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.294280 4971 scope.go:117] "RemoveContainer" containerID="6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"
Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.294730 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b\": container with ID starting with 6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b not found: ID does not exist" containerID="6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"
Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.294761 4971 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b"} err="failed to get container status \"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b\": rpc error: code = NotFound desc = could not find container \"6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b\": container with ID starting with 6f3107911aa7fed5ce204be52c5e4f42ea17e7dc2e1bd34b47cd9d4acf8a077b not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.294778 4971 scope.go:117] "RemoveContainer" containerID="121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.295074 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13\": container with ID starting with 121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13 not found: ID does not exist" containerID="121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295101 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13"} err="failed to get container status \"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13\": rpc error: code = NotFound desc = could not find container \"121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13\": container with ID starting with 121a90c09f9c392afe81374c449487857e4e089ff467bf96b51986d9517ffc13 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295115 4971 scope.go:117] "RemoveContainer" containerID="5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.295367 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c\": container with ID starting with 5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c not found: ID does not exist" containerID="5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295386 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c"} err="failed to get container status \"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c\": rpc error: code = NotFound desc = could not find container \"5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c\": container with ID starting with 5b04e74aeb6b43bdc4c680266f8796501ff152e2bbb4c4781e0892e1b5b2822c not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295401 4971 scope.go:117] "RemoveContainer" containerID="a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.295610 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9\": container with ID starting with a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9 not found: ID does not exist" containerID="a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295628 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9"} err="failed to get container status \"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9\": rpc error: code = NotFound desc = could 
not find container \"a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9\": container with ID starting with a036c583066b72a231c3293a4b9c4774af28c7de5ea959bad83acf1618efcbe9 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295640 4971 scope.go:117] "RemoveContainer" containerID="1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.295870 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc\": container with ID starting with 1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc not found: ID does not exist" containerID="1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295887 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc"} err="failed to get container status \"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc\": rpc error: code = NotFound desc = could not find container \"1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc\": container with ID starting with 1d15d67628c21612f8710d292982a5e01a70e02eabd3218c498d759364ecd6bc not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.295899 4971 scope.go:117] "RemoveContainer" containerID="192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.296093 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17\": container with ID starting with 192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17 not found: 
ID does not exist" containerID="192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296118 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17"} err="failed to get container status \"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17\": rpc error: code = NotFound desc = could not find container \"192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17\": container with ID starting with 192f90e34d0663d6e2965b1a7b69f75914a2b0b54f1d1415713f38a3a6d31b17 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296134 4971 scope.go:117] "RemoveContainer" containerID="c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.296400 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0\": container with ID starting with c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0 not found: ID does not exist" containerID="c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296420 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0"} err="failed to get container status \"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0\": rpc error: code = NotFound desc = could not find container \"c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0\": container with ID starting with c95508be51add3e7d0796dc7cb4960c28457d409183918120fa7690944dd13d0 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296431 4971 
scope.go:117] "RemoveContainer" containerID="cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.296802 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4\": container with ID starting with cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4 not found: ID does not exist" containerID="cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296821 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4"} err="failed to get container status \"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4\": rpc error: code = NotFound desc = could not find container \"cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4\": container with ID starting with cb1cd3bf27a24f2736c0759f0ec976fc1d7c9b4e09b6185137076ce9b86d7de4 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.296833 4971 scope.go:117] "RemoveContainer" containerID="848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.300781 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e\": container with ID starting with 848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e not found: ID does not exist" containerID="848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.300844 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e"} err="failed to get container status \"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e\": rpc error: code = NotFound desc = could not find container \"848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e\": container with ID starting with 848aad21bf36207d4da113fb7cd49be9b8fa537539be36374caa4a2cb670844e not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.300873 4971 scope.go:117] "RemoveContainer" containerID="9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.317685 4971 scope.go:117] "RemoveContainer" containerID="4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.333821 4971 scope.go:117] "RemoveContainer" containerID="028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.349755 4971 scope.go:117] "RemoveContainer" containerID="6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.367118 4971 scope.go:117] "RemoveContainer" containerID="dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.386927 4971 scope.go:117] "RemoveContainer" containerID="d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.404520 4971 scope.go:117] "RemoveContainer" containerID="a09b6b080b275f78b31ee43c6251e1ac8b9df12f3d3f6e0cb2935eb9aac50aed" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.419769 4971 scope.go:117] "RemoveContainer" containerID="2bd62e61d9273d27adae24883798b95d981eba165342c0115a8ed473604a86a5" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.435985 4971 scope.go:117] "RemoveContainer" 
containerID="b691a5dd6b911fb191a9c07bc398f6bbebb3f00217946b2ff8238c3bb5f4731d" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.453692 4971 scope.go:117] "RemoveContainer" containerID="0c8ebcacb6282d984b0446d963fdc2bd3528a6f38ec6de9de6a73c6dd79bccb6" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.478212 4971 scope.go:117] "RemoveContainer" containerID="b5aacb4f2dd858d7eb95084bd2708dbe1b313f689202584451b13642d9b8f55f" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.497265 4971 scope.go:117] "RemoveContainer" containerID="b10368b9dc860c3bd7458e035e3e68547c3356437a15b60e49b2b6502921da5e" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.514407 4971 scope.go:117] "RemoveContainer" containerID="046324d703432fdbb7bf43b7e37121cadf48166bd4a1318c968d8008c238e159" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.532607 4971 scope.go:117] "RemoveContainer" containerID="63998b1ce663cb73c696042d13093ad012e5476f293ecfa291bfe156e6a6731b" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.552897 4971 scope.go:117] "RemoveContainer" containerID="0846790c197ee9ba49995a83fe78257d13b81b6633feb00583ad1f49ffdf762a" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.570684 4971 scope.go:117] "RemoveContainer" containerID="9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.571038 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9\": container with ID starting with 9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9 not found: ID does not exist" containerID="9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571069 4971 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9"} err="failed to get container status \"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9\": rpc error: code = NotFound desc = could not find container \"9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9\": container with ID starting with 9f70981b5e4f72eda1c8702f375251a2076f188c8de7ff62f47f3965fab5ebe9 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571094 4971 scope.go:117] "RemoveContainer" containerID="4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.571407 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68\": container with ID starting with 4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68 not found: ID does not exist" containerID="4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571439 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68"} err="failed to get container status \"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68\": rpc error: code = NotFound desc = could not find container \"4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68\": container with ID starting with 4f4c3914c725af7e2c42f00fd89515519532d47360a29b5c80c0e74d5900dd68 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571459 4971 scope.go:117] "RemoveContainer" containerID="028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.571790 4971 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55\": container with ID starting with 028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55 not found: ID does not exist" containerID="028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571815 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55"} err="failed to get container status \"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55\": rpc error: code = NotFound desc = could not find container \"028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55\": container with ID starting with 028bf2e70f6264d9ac250238c465525469085579c8a83d96f201c0b1d5db8e55 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.571833 4971 scope.go:117] "RemoveContainer" containerID="6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.572101 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2\": container with ID starting with 6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2 not found: ID does not exist" containerID="6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.572128 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2"} err="failed to get container status \"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2\": rpc error: code = NotFound desc = could not find container 
\"6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2\": container with ID starting with 6c34a39cd1f6d1492aad90077ec6a42408f46f3bcccb7b0064b9bf0aa8abf4a2 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.572146 4971 scope.go:117] "RemoveContainer" containerID="dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.572429 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1\": container with ID starting with dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1 not found: ID does not exist" containerID="dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.572455 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1"} err="failed to get container status \"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1\": rpc error: code = NotFound desc = could not find container \"dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1\": container with ID starting with dc82b02f06416b72870d4e96fa28c536b46ebac807f74b9e586c94cff84ebbd1 not found: ID does not exist" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.572472 4971 scope.go:117] "RemoveContainer" containerID="d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af" Mar 09 09:39:52 crc kubenswrapper[4971]: E0309 09:39:52.573229 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af\": container with ID starting with d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af not found: ID does not exist" 
containerID="d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af" Mar 09 09:39:52 crc kubenswrapper[4971]: I0309 09:39:52.573280 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af"} err="failed to get container status \"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af\": rpc error: code = NotFound desc = could not find container \"d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af\": container with ID starting with d2c2292180738b4ce2c8c5098f9124b67253c759fa65dc97e488d1143f9022af not found: ID does not exist" Mar 09 09:39:53 crc kubenswrapper[4971]: I0309 09:39:53.158569 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" path="/var/lib/kubelet/pods/52c89471-afd6-4cce-8a00-54dbcd4ef92b/volumes" Mar 09 09:39:53 crc kubenswrapper[4971]: I0309 09:39:53.160506 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" path="/var/lib/kubelet/pods/6f4feb95-a276-4089-9876-d30cde31f67c/volumes" Mar 09 09:39:53 crc kubenswrapper[4971]: I0309 09:39:53.161844 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" path="/var/lib/kubelet/pods/f302bdd8-8044-48a4-aacd-13967f94570c/volumes" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.873964 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874253 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874268 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: 
E0309 09:39:54.874281 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874288 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874300 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874306 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874318 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874324 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874331 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874390 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874396 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 
09:39:54.874407 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874412 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874451 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874457 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874465 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874470 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874479 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874485 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874496 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874502 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 
09:39:54.874508 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874514 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874524 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-httpd" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874529 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-httpd" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874536 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874541 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874548 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874554 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874561 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874568 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874577 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874582 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874590 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874596 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874605 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874612 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874618 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874623 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874631 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874637 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874646 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874652 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874660 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874666 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874675 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874681 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874693 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874700 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874706 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874714 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874727 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874733 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874746 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874752 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874762 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874769 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874782 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874788 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874798 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cd61bd-9afd-40de-8369-c704972e7314" containerName="swift-ring-rebalance" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874805 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cd61bd-9afd-40de-8369-c704972e7314" containerName="swift-ring-rebalance" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874814 
4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874822 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874833 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874840 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874849 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874855 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874867 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874874 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874884 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874890 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874899 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874906 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874915 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874922 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874933 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874940 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874952 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874959 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874970 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874978 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.874988 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.874996 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875008 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875015 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875025 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875033 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875043 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875050 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875060 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875068 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875079 4971 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875086 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: E0309 09:39:54.875094 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875101 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875245 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875256 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875264 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875273 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875283 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875295 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875302 4971 
memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875311 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875323 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875336 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875363 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875376 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875384 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875396 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875406 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875417 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-replicator" Mar 09 09:39:54 crc 
kubenswrapper[4971]: I0309 09:39:54.875428 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875436 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875446 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875455 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875464 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875474 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875484 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875492 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875501 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875509 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" 
containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875517 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875525 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875535 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875544 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875554 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875560 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875570 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875581 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-httpd" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875588 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875597 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52cd61bd-9afd-40de-8369-c704972e7314" containerName="swift-ring-rebalance" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875605 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="account-reaper" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875612 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="315d491f-24ac-4eda-9e07-1e0533f2f9b7" containerName="proxy-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875623 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="object-expirer" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875632 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875642 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f302bdd8-8044-48a4-aacd-13967f94570c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875651 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-replicator" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875661 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="account-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875670 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="rsync" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875679 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="object-server" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875690 4971 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6f4feb95-a276-4089-9876-d30cde31f67c" containerName="container-auditor" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875699 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="container-updater" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.875708 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c89471-afd6-4cce-8a00-54dbcd4ef92b" containerName="swift-recon-cron" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.881250 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.883895 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.884403 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-rq69w" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.887205 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.887507 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.889833 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.944014 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.944084 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.944155 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.944181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:54 crc kubenswrapper[4971]: I0309 09:39:54.944208 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8hm\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046167 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046246 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046271 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8hm\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046368 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.046613 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.046649 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.046706 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift podName:7cf1281b-f79b-4219-902e-eea6fb707cb4 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:39:55.546686503 +0000 UTC m=+1199.106614313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift") pod "swift-storage-0" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4") : configmap "swift-ring-files" not found Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046817 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046913 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.046933 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.066055 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8hm\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.068813 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: I0309 09:39:55.553526 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.553740 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.553778 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:39:55 crc kubenswrapper[4971]: E0309 09:39:55.553854 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift podName:7cf1281b-f79b-4219-902e-eea6fb707cb4 nodeName:}" failed. No retries permitted until 2026-03-09 09:39:56.553830465 +0000 UTC m=+1200.113758305 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift") pod "swift-storage-0" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4") : configmap "swift-ring-files" not found Mar 09 09:39:56 crc kubenswrapper[4971]: I0309 09:39:56.566598 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:56 crc kubenswrapper[4971]: E0309 09:39:56.566852 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:39:56 crc kubenswrapper[4971]: E0309 09:39:56.566904 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:39:56 crc kubenswrapper[4971]: E0309 09:39:56.567004 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift podName:7cf1281b-f79b-4219-902e-eea6fb707cb4 nodeName:}" failed. No retries permitted until 2026-03-09 09:39:58.566975546 +0000 UTC m=+1202.126903386 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift") pod "swift-storage-0" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4") : configmap "swift-ring-files" not found Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.599502 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:39:58 crc kubenswrapper[4971]: E0309 09:39:58.599805 4971 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:39:58 crc kubenswrapper[4971]: E0309 09:39:58.601341 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:39:58 crc kubenswrapper[4971]: E0309 09:39:58.601464 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift podName:7cf1281b-f79b-4219-902e-eea6fb707cb4 nodeName:}" failed. No retries permitted until 2026-03-09 09:40:02.601431792 +0000 UTC m=+1206.161359632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift") pod "swift-storage-0" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4") : configmap "swift-ring-files" not found Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.734975 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-xbjgd"] Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.736072 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.741711 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.741794 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.741794 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.756840 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-xbjgd"] Mar 09 09:39:58 crc kubenswrapper[4971]: E0309 09:39:58.759496 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-r6kz7 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-r6kz7 ring-data-devices scripts swiftconf]: context canceled" pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" podUID="c63a6882-d227-46d1-b1f2-0df47c30b82f" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.775759 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-55fsl"] Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.776730 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.783727 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-55fsl"] Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.793304 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-xbjgd"] Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804481 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2n42\" (UniqueName: \"kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804550 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804587 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804605 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf\") pod \"swift-ring-rebalance-55fsl\" (UID: 
\"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804641 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6kz7\" (UniqueName: \"kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804757 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804775 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.804799 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945334 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945801 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6kz7\" (UniqueName: \"kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945866 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945898 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945943 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.945997 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.946024 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.946053 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.946095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2n42\" (UniqueName: \"kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.946130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.946334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.947315 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.947579 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.948557 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.950070 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts\") pod \"swift-ring-rebalance-xbjgd\" 
(UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.950554 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.951589 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.951605 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.951693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.952631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" 
Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.964538 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6kz7\" (UniqueName: \"kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7\") pod \"swift-ring-rebalance-xbjgd\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:58 crc kubenswrapper[4971]: I0309 09:39:58.968439 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2n42\" (UniqueName: \"kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42\") pod \"swift-ring-rebalance-55fsl\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") " pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.019466 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.029904 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.092750 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-rq69w" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.101068 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.148726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.148876 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6kz7\" (UniqueName: \"kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.149054 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.149110 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.150280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.149942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts" (OuterVolumeSpecName: "scripts") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.150357 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift\") pod \"c63a6882-d227-46d1-b1f2-0df47c30b82f\" (UID: \"c63a6882-d227-46d1-b1f2-0df47c30b82f\") " Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.150858 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.151072 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.152642 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.153039 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7" (OuterVolumeSpecName: "kube-api-access-r6kz7") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "kube-api-access-r6kz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.153716 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.153756 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c63a6882-d227-46d1-b1f2-0df47c30b82f" (UID: "c63a6882-d227-46d1-b1f2-0df47c30b82f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.252256 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6kz7\" (UniqueName: \"kubernetes.io/projected/c63a6882-d227-46d1-b1f2-0df47c30b82f-kube-api-access-r6kz7\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.252623 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.252638 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c63a6882-d227-46d1-b1f2-0df47c30b82f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.252651 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c63a6882-d227-46d1-b1f2-0df47c30b82f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.252662 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c63a6882-d227-46d1-b1f2-0df47c30b82f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:59 crc kubenswrapper[4971]: I0309 09:39:59.515433 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-55fsl"] Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.037806 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-xbjgd" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.037844 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" event={"ID":"b90b7a65-9704-4b25-9ad9-56ed4bb14886","Type":"ContainerStarted","Data":"645a97b2a1faf42b1dacd4e05546c5e4002c02315607b9596cc2c5be2517c0e7"} Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.038257 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" event={"ID":"b90b7a65-9704-4b25-9ad9-56ed4bb14886","Type":"ContainerStarted","Data":"7323677bd50e2803ba6720fde89f5b8e162a1409d2f8e4142b190a089f993df7"} Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.059597 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" podStartSLOduration=2.059550642 podStartE2EDuration="2.059550642s" podCreationTimestamp="2026-03-09 09:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:40:00.056303449 +0000 UTC m=+1203.616231269" watchObservedRunningTime="2026-03-09 09:40:00.059550642 +0000 UTC m=+1203.619478462" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.093739 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-xbjgd"] Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.100417 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-xbjgd"] Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.131044 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550820-rws2m"] Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.131954 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-rws2m" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.137428 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.137732 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.137932 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.153813 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-rws2m"] Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.267413 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx6dj\" (UniqueName: \"kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj\") pod \"auto-csr-approver-29550820-rws2m\" (UID: \"0b79462f-fbdd-40be-9c60-adca2d053c26\") " pod="openshift-infra/auto-csr-approver-29550820-rws2m" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.368507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx6dj\" (UniqueName: \"kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj\") pod \"auto-csr-approver-29550820-rws2m\" (UID: \"0b79462f-fbdd-40be-9c60-adca2d053c26\") " pod="openshift-infra/auto-csr-approver-29550820-rws2m" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.404194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx6dj\" (UniqueName: \"kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj\") pod \"auto-csr-approver-29550820-rws2m\" (UID: \"0b79462f-fbdd-40be-9c60-adca2d053c26\") " 
pod="openshift-infra/auto-csr-approver-29550820-rws2m" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.457288 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-rws2m" Mar 09 09:40:00 crc kubenswrapper[4971]: I0309 09:40:00.908652 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-rws2m"] Mar 09 09:40:00 crc kubenswrapper[4971]: W0309 09:40:00.913943 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b79462f_fbdd_40be_9c60_adca2d053c26.slice/crio-ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8 WatchSource:0}: Error finding container ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8: Status 404 returned error can't find the container with id ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8 Mar 09 09:40:01 crc kubenswrapper[4971]: I0309 09:40:01.055037 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-rws2m" event={"ID":"0b79462f-fbdd-40be-9c60-adca2d053c26","Type":"ContainerStarted","Data":"ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8"} Mar 09 09:40:01 crc kubenswrapper[4971]: I0309 09:40:01.161455 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63a6882-d227-46d1-b1f2-0df47c30b82f" path="/var/lib/kubelet/pods/c63a6882-d227-46d1-b1f2-0df47c30b82f/volumes" Mar 09 09:40:02 crc kubenswrapper[4971]: I0309 09:40:02.702323 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:40:02 crc kubenswrapper[4971]: E0309 09:40:02.702567 4971 projected.go:288] Couldn't get 
configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 09:40:02 crc kubenswrapper[4971]: E0309 09:40:02.702601 4971 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 09:40:02 crc kubenswrapper[4971]: E0309 09:40:02.702677 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift podName:7cf1281b-f79b-4219-902e-eea6fb707cb4 nodeName:}" failed. No retries permitted until 2026-03-09 09:40:10.70265354 +0000 UTC m=+1214.262581350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift") pod "swift-storage-0" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4") : configmap "swift-ring-files" not found
Mar 09 09:40:03 crc kubenswrapper[4971]: I0309 09:40:03.073415 4971 generic.go:334] "Generic (PLEG): container finished" podID="0b79462f-fbdd-40be-9c60-adca2d053c26" containerID="a5ded82dd6611ffcec77dcce3edfc9bf31e4bbedeb59631d196bba143fc77b63" exitCode=0
Mar 09 09:40:03 crc kubenswrapper[4971]: I0309 09:40:03.073516 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-rws2m" event={"ID":"0b79462f-fbdd-40be-9c60-adca2d053c26","Type":"ContainerDied","Data":"a5ded82dd6611ffcec77dcce3edfc9bf31e4bbedeb59631d196bba143fc77b63"}
Mar 09 09:40:04 crc kubenswrapper[4971]: I0309 09:40:04.350474 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-rws2m"
Mar 09 09:40:04 crc kubenswrapper[4971]: I0309 09:40:04.429850 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx6dj\" (UniqueName: \"kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj\") pod \"0b79462f-fbdd-40be-9c60-adca2d053c26\" (UID: \"0b79462f-fbdd-40be-9c60-adca2d053c26\") "
Mar 09 09:40:04 crc kubenswrapper[4971]: I0309 09:40:04.446697 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj" (OuterVolumeSpecName: "kube-api-access-nx6dj") pod "0b79462f-fbdd-40be-9c60-adca2d053c26" (UID: "0b79462f-fbdd-40be-9c60-adca2d053c26"). InnerVolumeSpecName "kube-api-access-nx6dj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:04 crc kubenswrapper[4971]: I0309 09:40:04.532491 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx6dj\" (UniqueName: \"kubernetes.io/projected/0b79462f-fbdd-40be-9c60-adca2d053c26-kube-api-access-nx6dj\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:05 crc kubenswrapper[4971]: I0309 09:40:05.090273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-rws2m" event={"ID":"0b79462f-fbdd-40be-9c60-adca2d053c26","Type":"ContainerDied","Data":"ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8"}
Mar 09 09:40:05 crc kubenswrapper[4971]: I0309 09:40:05.090321 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7c99c92acf3fc8a7cca968191eaaea1362f7a853745f943ce43d6b7be9b3f8"
Mar 09 09:40:05 crc kubenswrapper[4971]: I0309 09:40:05.090405 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-rws2m"
Mar 09 09:40:05 crc kubenswrapper[4971]: I0309 09:40:05.410235 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-b5bs7"]
Mar 09 09:40:05 crc kubenswrapper[4971]: I0309 09:40:05.417047 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-b5bs7"]
Mar 09 09:40:07 crc kubenswrapper[4971]: I0309 09:40:07.106156 4971 generic.go:334] "Generic (PLEG): container finished" podID="b90b7a65-9704-4b25-9ad9-56ed4bb14886" containerID="645a97b2a1faf42b1dacd4e05546c5e4002c02315607b9596cc2c5be2517c0e7" exitCode=0
Mar 09 09:40:07 crc kubenswrapper[4971]: I0309 09:40:07.106209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" event={"ID":"b90b7a65-9704-4b25-9ad9-56ed4bb14886","Type":"ContainerDied","Data":"645a97b2a1faf42b1dacd4e05546c5e4002c02315607b9596cc2c5be2517c0e7"}
Mar 09 09:40:07 crc kubenswrapper[4971]: I0309 09:40:07.159242 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e7a398-d1de-4a76-bd69-ff9c7269a24a" path="/var/lib/kubelet/pods/05e7a398-d1de-4a76-bd69-ff9c7269a24a/volumes"
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.395717 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl"
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494601 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494665 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494766 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494801 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494828 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.494855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2n42\" (UniqueName: \"kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42\") pod \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\" (UID: \"b90b7a65-9704-4b25-9ad9-56ed4bb14886\") "
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.495397 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.495783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.495887 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.495900 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b90b7a65-9704-4b25-9ad9-56ed4bb14886-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.499811 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42" (OuterVolumeSpecName: "kube-api-access-n2n42") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "kube-api-access-n2n42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.502246 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.515824 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.521914 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts" (OuterVolumeSpecName: "scripts") pod "b90b7a65-9704-4b25-9ad9-56ed4bb14886" (UID: "b90b7a65-9704-4b25-9ad9-56ed4bb14886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.596949 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.596981 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b90b7a65-9704-4b25-9ad9-56ed4bb14886-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.596990 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b90b7a65-9704-4b25-9ad9-56ed4bb14886-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:08 crc kubenswrapper[4971]: I0309 09:40:08.597003 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2n42\" (UniqueName: \"kubernetes.io/projected/b90b7a65-9704-4b25-9ad9-56ed4bb14886-kube-api-access-n2n42\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:09 crc kubenswrapper[4971]: I0309 09:40:09.122953 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl" event={"ID":"b90b7a65-9704-4b25-9ad9-56ed4bb14886","Type":"ContainerDied","Data":"7323677bd50e2803ba6720fde89f5b8e162a1409d2f8e4142b190a089f993df7"}
Mar 09 09:40:09 crc kubenswrapper[4971]: I0309 09:40:09.122994 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7323677bd50e2803ba6720fde89f5b8e162a1409d2f8e4142b190a089f993df7"
Mar 09 09:40:09 crc kubenswrapper[4971]: I0309 09:40:09.122993 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-55fsl"
Mar 09 09:40:10 crc kubenswrapper[4971]: I0309 09:40:10.728979 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:40:10 crc kubenswrapper[4971]: I0309 09:40:10.738335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"swift-storage-0\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:40:10 crc kubenswrapper[4971]: I0309 09:40:10.815628 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Mar 09 09:40:11 crc kubenswrapper[4971]: I0309 09:40:11.275759 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.149730 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"384d552cdcfa77a481f9fa7d9339755f30189066b1c0db676dd5af57ff9f3d86"}
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.150098 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"b6a6726c2672f3f891932c39730d7bf24ebbbaf74b224a8e384a86cc59e3866f"}
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.150114 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"10310cdbcf68045954b30655355f31ab5609c827bd113ceb6f36c7acdb67568b"}
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.150126 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"33d9df249f56981ed105c7cd6de3c253f19a8190ec6977cac54400d478cb03e7"}
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.150138 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"70a4736032a91173a8081a9d98939447d2f1ecece350b377d82dde74455e3069"}
Mar 09 09:40:12 crc kubenswrapper[4971]: I0309 09:40:12.150151 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"fb1da211386c13b25b034b538da4de69f4cde740ab8f03c5cb0885f93c546dfa"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188139 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"0862fe7dba311c6c6b693d2c332d0b86b3f53acee8d036abe908aff5cb6d6e8f"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188513 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"d7adf37c0f5ab81fc50a28e521f676d1489460683f4b8dfc2e36052a19f9d07e"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188529 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"e0fd9fa4462f1853a43e7cba4a34485ff301c975cb23eef544f0c47106ccc0de"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188543 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"24779d2ef2862ffd4c9ec9f48e564d2007ab125e37e3d3b4e67d1ea10b04135f"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"e990ced9d0c8867ebf7e37598faa5157c49269703059f5157a1ed4694f8990a2"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188564 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"439626a0c50fc4086e7623b9d44cec2d3c1789da2fdcfd975bd6b2c21ff67bde"}
Mar 09 09:40:13 crc kubenswrapper[4971]: I0309 09:40:13.188575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"6cac6b8f712e2749ddb02ae9fa9c26cde95ba42afc84e1c1ab4eb15dda4699ae"}
Mar 09 09:40:14 crc kubenswrapper[4971]: I0309 09:40:14.203494 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"896548583b712817fd9ac2457aa52b694754b489cd6ec4d51bcf7a2f46d8c65d"}
Mar 09 09:40:14 crc kubenswrapper[4971]: I0309 09:40:14.203542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"bd08ba1a76889c53f7f72fa52eabc950709d56d78c23c1a9da6fd3dbfe751148"}
Mar 09 09:40:14 crc kubenswrapper[4971]: I0309 09:40:14.203553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"977e8188d1d5d046c5f1d39c7dac78e43224aaeb87daa3d4bee54aee924a5664"}
Mar 09 09:40:14 crc kubenswrapper[4971]: I0309 09:40:14.203565 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerStarted","Data":"8173809deda0ceecec61a266c95370ae3e2c5629f0fc60a80c3c4c47b3894635"}
Mar 09 09:40:14 crc kubenswrapper[4971]: I0309 09:40:14.236902 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=21.236883051 podStartE2EDuration="21.236883051s" podCreationTimestamp="2026-03-09 09:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:40:14.235984495 +0000 UTC m=+1217.795912325" watchObservedRunningTime="2026-03-09 09:40:14.236883051 +0000 UTC m=+1217.796810851"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.986277 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"]
Mar 09 09:40:20 crc kubenswrapper[4971]: E0309 09:40:20.987147 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90b7a65-9704-4b25-9ad9-56ed4bb14886" containerName="swift-ring-rebalance"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.987165 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90b7a65-9704-4b25-9ad9-56ed4bb14886" containerName="swift-ring-rebalance"
Mar 09 09:40:20 crc kubenswrapper[4971]: E0309 09:40:20.987182 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b79462f-fbdd-40be-9c60-adca2d053c26" containerName="oc"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.987191 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b79462f-fbdd-40be-9c60-adca2d053c26" containerName="oc"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.987387 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b79462f-fbdd-40be-9c60-adca2d053c26" containerName="oc"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.987408 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90b7a65-9704-4b25-9ad9-56ed4bb14886" containerName="swift-ring-rebalance"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.988294 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.992555 4971 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data"
Mar 09 09:40:20 crc kubenswrapper[4971]: I0309 09:40:20.995191 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"]
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.108702 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.108798 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftfz5\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.108839 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.108961 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.109062 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.210289 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftfz5\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.210382 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.210448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.210534 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.210567 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.211719 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.212096 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.217089 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.217237 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.230066 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftfz5\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5\") pod \"swift-proxy-6fcb54769f-hp2hb\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") " pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.318169 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:21 crc kubenswrapper[4971]: I0309 09:40:21.725382 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"]
Mar 09 09:40:22 crc kubenswrapper[4971]: I0309 09:40:22.262977 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerStarted","Data":"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c"}
Mar 09 09:40:22 crc kubenswrapper[4971]: I0309 09:40:22.263423 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerStarted","Data":"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2"}
Mar 09 09:40:22 crc kubenswrapper[4971]: I0309 09:40:22.263438 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerStarted","Data":"0e2689e9b77c35b20af58de129e3a1646e5cc3e031477dab4cc4252b9d329442"}
Mar 09 09:40:22 crc kubenswrapper[4971]: I0309 09:40:22.263454 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:22 crc kubenswrapper[4971]: I0309 09:40:22.285772 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" podStartSLOduration=2.285751143 podStartE2EDuration="2.285751143s" podCreationTimestamp="2026-03-09 09:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:40:22.281293125 +0000 UTC m=+1225.841220935" watchObservedRunningTime="2026-03-09 09:40:22.285751143 +0000 UTC m=+1225.845678953"
Mar 09 09:40:23 crc kubenswrapper[4971]: I0309 09:40:23.268527 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:26 crc kubenswrapper[4971]: I0309 09:40:26.324980 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:26 crc kubenswrapper[4971]: I0309 09:40:26.325862 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.115288 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"]
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.116899 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.120282 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.129004 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"]
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.129289 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.243684 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.243742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.243761 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.243831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.243931 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkn8\" (UniqueName: \"kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.244030 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345448 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345547 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345595 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345618 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkn8\" (UniqueName: \"kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.345682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.346116 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.346335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.346497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.350544 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.350649 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.360449 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkn8\" (UniqueName: \"kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8\") pod \"swift-ring-rebalance-debug-lkwgk\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.446693 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"
Mar 09 09:40:28 crc kubenswrapper[4971]: I0309 09:40:28.871109 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"]
Mar 09 09:40:28 crc kubenswrapper[4971]: W0309 09:40:28.874709 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dcbe3ed_4f06_47ef_a40f_8e74b9dead13.slice/crio-3d751eb0614a0916d05595f9c1f1add0acdc379845012956d6f2e595f5b61db0 WatchSource:0}: Error finding container 3d751eb0614a0916d05595f9c1f1add0acdc379845012956d6f2e595f5b61db0: Status 404 returned error can't find the container with id 3d751eb0614a0916d05595f9c1f1add0acdc379845012956d6f2e595f5b61db0
Mar 09 09:40:29 crc kubenswrapper[4971]: I0309 09:40:29.314554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" event={"ID":"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13","Type":"ContainerStarted","Data":"4e17ff67043c8a4c5bc82b0e043e0e00ca12a3119aba0c9683f31883e5175949"}
Mar 09 09:40:29 crc kubenswrapper[4971]: I0309 09:40:29.314601 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" event={"ID":"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13","Type":"ContainerStarted","Data":"3d751eb0614a0916d05595f9c1f1add0acdc379845012956d6f2e595f5b61db0"}
Mar 09 09:40:29 crc kubenswrapper[4971]: I0309 09:40:29.339557 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" podStartSLOduration=1.3395320210000001 podStartE2EDuration="1.339532021s" podCreationTimestamp="2026-03-09 09:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:40:29.32976137 +0000 UTC m=+1232.889689180" watchObservedRunningTime="2026-03-09 09:40:29.339532021 +0000 UTC m=+1232.899459841"
Mar 09 09:40:31 crc kubenswrapper[4971]: I0309 09:40:31.330637 4971 generic.go:334] "Generic (PLEG): container finished" podID="8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" containerID="4e17ff67043c8a4c5bc82b0e043e0e00ca12a3119aba0c9683f31883e5175949" exitCode=0
Mar 09 09:40:31 crc kubenswrapper[4971]: I0309 09:40:31.330710 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" event={"ID":"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13","Type":"ContainerDied","Data":"4e17ff67043c8a4c5bc82b0e043e0e00ca12a3119aba0c9683f31883e5175949"}
Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.603878 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.643245 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"] Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.651778 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk"] Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.710544 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkn8\" (UniqueName: \"kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.711515 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.711710 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.712519 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.712427 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.712622 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.712655 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf\") pod \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\" (UID: \"8dcbe3ed-4f06-47ef-a40f-8e74b9dead13\") " Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.713263 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.713456 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.715444 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8" (OuterVolumeSpecName: "kube-api-access-ckkn8") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "kube-api-access-ckkn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.730906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts" (OuterVolumeSpecName: "scripts") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.735573 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.736672 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" (UID: "8dcbe3ed-4f06-47ef-a40f-8e74b9dead13"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.779769 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fctn"] Mar 09 09:40:32 crc kubenswrapper[4971]: E0309 09:40:32.780108 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" containerName="swift-ring-rebalance" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.780130 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" containerName="swift-ring-rebalance" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.780311 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" containerName="swift-ring-rebalance" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.780849 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.794249 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fctn"] Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814731 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814814 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814851 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hst\" (UniqueName: \"kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814943 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.814969 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.815019 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkn8\" (UniqueName: \"kubernetes.io/projected/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-kube-api-access-ckkn8\") 
on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.815031 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.815040 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.815049 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.815090 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916149 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-c4hst\" (UniqueName: \"kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916232 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916263 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.916306 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.917017 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.917020 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.917605 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.920199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.920296 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:32 crc kubenswrapper[4971]: I0309 09:40:32.941521 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hst\" (UniqueName: \"kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst\") pod \"swift-ring-rebalance-debug-6fctn\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:33 crc kubenswrapper[4971]: I0309 09:40:33.101599 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:33 crc kubenswrapper[4971]: I0309 09:40:33.171086 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcbe3ed-4f06-47ef-a40f-8e74b9dead13" path="/var/lib/kubelet/pods/8dcbe3ed-4f06-47ef-a40f-8e74b9dead13/volumes" Mar 09 09:40:33 crc kubenswrapper[4971]: I0309 09:40:33.395340 4971 scope.go:117] "RemoveContainer" containerID="4e17ff67043c8a4c5bc82b0e043e0e00ca12a3119aba0c9683f31883e5175949" Mar 09 09:40:33 crc kubenswrapper[4971]: I0309 09:40:33.395912 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lkwgk" Mar 09 09:40:33 crc kubenswrapper[4971]: I0309 09:40:33.548711 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fctn"] Mar 09 09:40:34 crc kubenswrapper[4971]: I0309 09:40:34.406284 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" event={"ID":"35dbab26-bcf8-4b7e-94da-a7b9501ac576","Type":"ContainerStarted","Data":"8ee4b6fa6ddf257f9019cad349464e7cc16abe10a077a926dffb5d1457bb6a3a"} Mar 09 09:40:34 crc kubenswrapper[4971]: I0309 09:40:34.406856 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" event={"ID":"35dbab26-bcf8-4b7e-94da-a7b9501ac576","Type":"ContainerStarted","Data":"b6f5667de67ad3ed43ba3b97a17cf50f65ce2efbd3bb313ff76f83f1b49cf1eb"} Mar 09 09:40:34 crc kubenswrapper[4971]: I0309 09:40:34.421479 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" podStartSLOduration=2.421459031 podStartE2EDuration="2.421459031s" podCreationTimestamp="2026-03-09 09:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
09:40:34.42072882 +0000 UTC m=+1237.980656640" watchObservedRunningTime="2026-03-09 09:40:34.421459031 +0000 UTC m=+1237.981386841" Mar 09 09:40:35 crc kubenswrapper[4971]: I0309 09:40:35.416861 4971 generic.go:334] "Generic (PLEG): container finished" podID="35dbab26-bcf8-4b7e-94da-a7b9501ac576" containerID="8ee4b6fa6ddf257f9019cad349464e7cc16abe10a077a926dffb5d1457bb6a3a" exitCode=0 Mar 09 09:40:35 crc kubenswrapper[4971]: I0309 09:40:35.416903 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" event={"ID":"35dbab26-bcf8-4b7e-94da-a7b9501ac576","Type":"ContainerDied","Data":"8ee4b6fa6ddf257f9019cad349464e7cc16abe10a077a926dffb5d1457bb6a3a"} Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.719852 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.759401 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fctn"] Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.766760 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fctn"] Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.894410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.894496 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: 
I0309 09:40:36.894574 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.894636 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.894728 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hst\" (UniqueName: \"kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.894917 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf\") pod \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\" (UID: \"35dbab26-bcf8-4b7e-94da-a7b9501ac576\") " Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.895850 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.895935 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.902608 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst" (OuterVolumeSpecName: "kube-api-access-c4hst") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "kube-api-access-c4hst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.917684 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.918066 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts" (OuterVolumeSpecName: "scripts") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.928622 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "35dbab26-bcf8-4b7e-94da-a7b9501ac576" (UID: "35dbab26-bcf8-4b7e-94da-a7b9501ac576"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996234 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996268 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35dbab26-bcf8-4b7e-94da-a7b9501ac576-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996278 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996286 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35dbab26-bcf8-4b7e-94da-a7b9501ac576-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996294 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35dbab26-bcf8-4b7e-94da-a7b9501ac576-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4971]: I0309 09:40:36.996304 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hst\" (UniqueName: 
\"kubernetes.io/projected/35dbab26-bcf8-4b7e-94da-a7b9501ac576-kube-api-access-c4hst\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:37 crc kubenswrapper[4971]: I0309 09:40:37.159402 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35dbab26-bcf8-4b7e-94da-a7b9501ac576" path="/var/lib/kubelet/pods/35dbab26-bcf8-4b7e-94da-a7b9501ac576/volumes" Mar 09 09:40:37 crc kubenswrapper[4971]: I0309 09:40:37.438612 4971 scope.go:117] "RemoveContainer" containerID="8ee4b6fa6ddf257f9019cad349464e7cc16abe10a077a926dffb5d1457bb6a3a" Mar 09 09:40:37 crc kubenswrapper[4971]: I0309 09:40:37.438679 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fctn" Mar 09 09:41:03 crc kubenswrapper[4971]: I0309 09:41:03.974251 4971 scope.go:117] "RemoveContainer" containerID="92a19ad23e1640c11c41f05e7db9e460f2c9b32944f1ef05fceb489fb73ec537" Mar 09 09:41:13 crc kubenswrapper[4971]: E0309 09:41:13.529972 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:56304->38.102.83.238:42765: write tcp 38.102.83.238:56304->38.102.83.238:42765: write: broken pipe Mar 09 09:41:44 crc kubenswrapper[4971]: I0309 09:41:44.795928 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:41:44 crc kubenswrapper[4971]: I0309 09:41:44.796626 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.151802 
4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550822-cqmvz"]
Mar 09 09:42:00 crc kubenswrapper[4971]: E0309 09:42:00.152520 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35dbab26-bcf8-4b7e-94da-a7b9501ac576" containerName="swift-ring-rebalance"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.152535 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="35dbab26-bcf8-4b7e-94da-a7b9501ac576" containerName="swift-ring-rebalance"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.152722 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="35dbab26-bcf8-4b7e-94da-a7b9501ac576" containerName="swift-ring-rebalance"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.153392 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.155393 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.155554 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.157847 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.170682 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-cqmvz"]
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.245049 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvssb\" (UniqueName: \"kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb\") pod \"auto-csr-approver-29550822-cqmvz\" (UID: \"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2\") " pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.346930 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvssb\" (UniqueName: \"kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb\") pod \"auto-csr-approver-29550822-cqmvz\" (UID: \"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2\") " pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.370814 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvssb\" (UniqueName: \"kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb\") pod \"auto-csr-approver-29550822-cqmvz\" (UID: \"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2\") " pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.472689 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:00 crc kubenswrapper[4971]: I0309 09:42:00.913342 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-cqmvz"]
Mar 09 09:42:00 crc kubenswrapper[4971]: W0309 09:42:00.919497 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca618e3d_049e_4a5b_b460_13fb0c3ad5d2.slice/crio-c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01 WatchSource:0}: Error finding container c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01: Status 404 returned error can't find the container with id c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01
Mar 09 09:42:01 crc kubenswrapper[4971]: I0309 09:42:01.090509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-cqmvz" event={"ID":"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2","Type":"ContainerStarted","Data":"c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01"}
Mar 09 09:42:03 crc kubenswrapper[4971]: I0309 09:42:03.105224 4971 generic.go:334] "Generic (PLEG): container finished" podID="ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" containerID="9a8ab46a496be8b8181a93a0b8ac165aad47948812fbaf5ea7e46650de7c9084" exitCode=0
Mar 09 09:42:03 crc kubenswrapper[4971]: I0309 09:42:03.105437 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-cqmvz" event={"ID":"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2","Type":"ContainerDied","Data":"9a8ab46a496be8b8181a93a0b8ac165aad47948812fbaf5ea7e46650de7c9084"}
Mar 09 09:42:04 crc kubenswrapper[4971]: I0309 09:42:04.369029 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:04 crc kubenswrapper[4971]: I0309 09:42:04.516024 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvssb\" (UniqueName: \"kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb\") pod \"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2\" (UID: \"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2\") "
Mar 09 09:42:04 crc kubenswrapper[4971]: I0309 09:42:04.522526 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb" (OuterVolumeSpecName: "kube-api-access-dvssb") pod "ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" (UID: "ca618e3d-049e-4a5b-b460-13fb0c3ad5d2"). InnerVolumeSpecName "kube-api-access-dvssb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:42:04 crc kubenswrapper[4971]: I0309 09:42:04.617288 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvssb\" (UniqueName: \"kubernetes.io/projected/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2-kube-api-access-dvssb\") on node \"crc\" DevicePath \"\""
Mar 09 09:42:05 crc kubenswrapper[4971]: I0309 09:42:05.127691 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-cqmvz" event={"ID":"ca618e3d-049e-4a5b-b460-13fb0c3ad5d2","Type":"ContainerDied","Data":"c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01"}
Mar 09 09:42:05 crc kubenswrapper[4971]: I0309 09:42:05.128106 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d0f1037572d0cc08e5f5bc80f76f2522e2ab3b89df6ec921caf6f135eaac01"
Mar 09 09:42:05 crc kubenswrapper[4971]: I0309 09:42:05.127981 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-cqmvz"
Mar 09 09:42:05 crc kubenswrapper[4971]: I0309 09:42:05.427714 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-8d9h6"]
Mar 09 09:42:05 crc kubenswrapper[4971]: I0309 09:42:05.437084 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-8d9h6"]
Mar 09 09:42:07 crc kubenswrapper[4971]: I0309 09:42:07.160516 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b3d640-db55-4357-9b31-46a45640a583" path="/var/lib/kubelet/pods/57b3d640-db55-4357-9b31-46a45640a583/volumes"
Mar 09 09:42:14 crc kubenswrapper[4971]: I0309 09:42:14.794481 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:42:14 crc kubenswrapper[4971]: I0309 09:42:14.795071 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:42:44 crc kubenswrapper[4971]: I0309 09:42:44.794612 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:42:44 crc kubenswrapper[4971]: I0309 09:42:44.795140 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:42:44 crc kubenswrapper[4971]: I0309 09:42:44.795183 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:42:44 crc kubenswrapper[4971]: I0309 09:42:44.795636 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:42:44 crc kubenswrapper[4971]: I0309 09:42:44.795682 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c" gracePeriod=600
Mar 09 09:42:45 crc kubenswrapper[4971]: I0309 09:42:45.416899 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c" exitCode=0
Mar 09 09:42:45 crc kubenswrapper[4971]: I0309 09:42:45.416976 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c"}
Mar 09 09:42:45 crc kubenswrapper[4971]: I0309 09:42:45.417609 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73"}
Mar 09 09:42:45 crc kubenswrapper[4971]: I0309 09:42:45.417630 4971 scope.go:117] "RemoveContainer" containerID="850265ce9f01a5c63d70bb3589fb993cb12b2014828540d2dac94573f14584e1"
Mar 09 09:43:04 crc kubenswrapper[4971]: I0309 09:43:04.078124 4971 scope.go:117] "RemoveContainer" containerID="f1721017276f6a271ee3c417166add857e6faf7b1a4707bafe463298269b1ab2"
Mar 09 09:43:18 crc kubenswrapper[4971]: E0309 09:43:18.398183 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:50436->38.102.83.238:42765: write tcp 38.102.83.238:50436->38.102.83.238:42765: write: broken pipe
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.134140 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550824-9v5wh"]
Mar 09 09:44:00 crc kubenswrapper[4971]: E0309 09:44:00.135114 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" containerName="oc"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.135132 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" containerName="oc"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.135869 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" containerName="oc"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.137694 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.145928 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-9v5wh"]
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.151059 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.151250 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.151543 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.324040 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdp9\" (UniqueName: \"kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9\") pod \"auto-csr-approver-29550824-9v5wh\" (UID: \"65488d0c-4805-4f55-ac7c-8c838a321dd8\") " pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.425869 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdp9\" (UniqueName: \"kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9\") pod \"auto-csr-approver-29550824-9v5wh\" (UID: \"65488d0c-4805-4f55-ac7c-8c838a321dd8\") " pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.454126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdp9\" (UniqueName: \"kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9\") pod \"auto-csr-approver-29550824-9v5wh\" (UID: \"65488d0c-4805-4f55-ac7c-8c838a321dd8\") " pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.470746 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.882946 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-9v5wh"]
Mar 09 09:44:00 crc kubenswrapper[4971]: I0309 09:44:00.891996 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:44:01 crc kubenswrapper[4971]: I0309 09:44:01.038203 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-9v5wh" event={"ID":"65488d0c-4805-4f55-ac7c-8c838a321dd8","Type":"ContainerStarted","Data":"c5fadc4140889c25e2fb5b269aad852f94902eaaa067e7a0f850dc16543e61a4"}
Mar 09 09:44:03 crc kubenswrapper[4971]: I0309 09:44:03.056017 4971 generic.go:334] "Generic (PLEG): container finished" podID="65488d0c-4805-4f55-ac7c-8c838a321dd8" containerID="31610ab8cdf335343919d37271e3ad98627b76e135511459bf3992633eea2899" exitCode=0
Mar 09 09:44:03 crc kubenswrapper[4971]: I0309 09:44:03.056072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-9v5wh" event={"ID":"65488d0c-4805-4f55-ac7c-8c838a321dd8","Type":"ContainerDied","Data":"31610ab8cdf335343919d37271e3ad98627b76e135511459bf3992633eea2899"}
Mar 09 09:44:04 crc kubenswrapper[4971]: I0309 09:44:04.348118 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:04 crc kubenswrapper[4971]: I0309 09:44:04.497622 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgdp9\" (UniqueName: \"kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9\") pod \"65488d0c-4805-4f55-ac7c-8c838a321dd8\" (UID: \"65488d0c-4805-4f55-ac7c-8c838a321dd8\") "
Mar 09 09:44:04 crc kubenswrapper[4971]: I0309 09:44:04.506849 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9" (OuterVolumeSpecName: "kube-api-access-tgdp9") pod "65488d0c-4805-4f55-ac7c-8c838a321dd8" (UID: "65488d0c-4805-4f55-ac7c-8c838a321dd8"). InnerVolumeSpecName "kube-api-access-tgdp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:44:04 crc kubenswrapper[4971]: I0309 09:44:04.599237 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgdp9\" (UniqueName: \"kubernetes.io/projected/65488d0c-4805-4f55-ac7c-8c838a321dd8-kube-api-access-tgdp9\") on node \"crc\" DevicePath \"\""
Mar 09 09:44:05 crc kubenswrapper[4971]: I0309 09:44:05.083683 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-9v5wh" event={"ID":"65488d0c-4805-4f55-ac7c-8c838a321dd8","Type":"ContainerDied","Data":"c5fadc4140889c25e2fb5b269aad852f94902eaaa067e7a0f850dc16543e61a4"}
Mar 09 09:44:05 crc kubenswrapper[4971]: I0309 09:44:05.084133 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5fadc4140889c25e2fb5b269aad852f94902eaaa067e7a0f850dc16543e61a4"
Mar 09 09:44:05 crc kubenswrapper[4971]: I0309 09:44:05.083739 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-9v5wh"
Mar 09 09:44:05 crc kubenswrapper[4971]: I0309 09:44:05.404434 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-248bz"]
Mar 09 09:44:05 crc kubenswrapper[4971]: I0309 09:44:05.409568 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-248bz"]
Mar 09 09:44:07 crc kubenswrapper[4971]: I0309 09:44:07.166913 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a672281e-9e08-4c3f-8c0c-fd4acd6f0666" path="/var/lib/kubelet/pods/a672281e-9e08-4c3f-8c0c-fd4acd6f0666/volumes"
Mar 09 09:44:33 crc kubenswrapper[4971]: E0309 09:44:33.335971 4971 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:47412->38.102.83.238:42765: write tcp 38.102.83.238:47412->38.102.83.238:42765: write: broken pipe
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.149569 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"]
Mar 09 09:45:00 crc kubenswrapper[4971]: E0309 09:45:00.150628 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65488d0c-4805-4f55-ac7c-8c838a321dd8" containerName="oc"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.150641 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="65488d0c-4805-4f55-ac7c-8c838a321dd8" containerName="oc"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.150801 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="65488d0c-4805-4f55-ac7c-8c838a321dd8" containerName="oc"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.151235 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.213283 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.213628 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.226584 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"]
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.275896 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22vk\" (UniqueName: \"kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.276204 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.276393 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.377617 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h22vk\" (UniqueName: \"kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.377750 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.377808 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.379456 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.384084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.396479 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22vk\" (UniqueName: \"kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk\") pod \"collect-profiles-29550825-hdbml\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.531549 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:00 crc kubenswrapper[4971]: I0309 09:45:00.968535 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"]
Mar 09 09:45:01 crc kubenswrapper[4971]: I0309 09:45:01.497884 4971 generic.go:334] "Generic (PLEG): container finished" podID="68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" containerID="785ac000d8e0e3d16bfaa1568f515d2d6560a3bbce5de32c5c7b7a65b75b422c" exitCode=0
Mar 09 09:45:01 crc kubenswrapper[4971]: I0309 09:45:01.497929 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml" event={"ID":"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9","Type":"ContainerDied","Data":"785ac000d8e0e3d16bfaa1568f515d2d6560a3bbce5de32c5c7b7a65b75b422c"}
Mar 09 09:45:01 crc kubenswrapper[4971]: I0309 09:45:01.497956 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml" event={"ID":"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9","Type":"ContainerStarted","Data":"fbca9b1bf516a5189d4b4a7a1930b595397ba3f0f465379043dd016dfe5e4c55"}
Mar 09 09:45:02 crc kubenswrapper[4971]: I0309 09:45:02.821945 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.017951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume\") pod \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") "
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.018392 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h22vk\" (UniqueName: \"kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk\") pod \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") "
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.018445 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume\") pod \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\" (UID: \"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9\") "
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.019236 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume" (OuterVolumeSpecName: "config-volume") pod "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" (UID: "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.023153 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk" (OuterVolumeSpecName: "kube-api-access-h22vk") pod "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" (UID: "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9"). InnerVolumeSpecName "kube-api-access-h22vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.026517 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" (UID: "68c9e4c8-9f54-4122-bc7d-d58c864d7cd9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.120185 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.120240 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h22vk\" (UniqueName: \"kubernetes.io/projected/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-kube-api-access-h22vk\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.120258 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c9e4c8-9f54-4122-bc7d-d58c864d7cd9-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.513869 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml" event={"ID":"68c9e4c8-9f54-4122-bc7d-d58c864d7cd9","Type":"ContainerDied","Data":"fbca9b1bf516a5189d4b4a7a1930b595397ba3f0f465379043dd016dfe5e4c55"}
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.513914 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbca9b1bf516a5189d4b4a7a1930b595397ba3f0f465379043dd016dfe5e4c55"
Mar 09 09:45:03 crc kubenswrapper[4971]: I0309 09:45:03.513929 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-hdbml"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.168337 4971 scope.go:117] "RemoveContainer" containerID="b1c066201987d391a8dd14926c5be6933ba1dbe962df715b3f3afa9727d8f13e"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.191496 4971 scope.go:117] "RemoveContainer" containerID="554187e3ab94ee132b60bde01c040ad442e61f588caa077ffab326909e95d74f"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.213545 4971 scope.go:117] "RemoveContainer" containerID="835399e58bad9c868523c328e4f2809cb440e6fd3f172ab8b0694f22bc790969"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.241227 4971 scope.go:117] "RemoveContainer" containerID="9f807e3c3ca88bd802a7aea7370f8a5cc7d67e20d59a31b0972dd8f3c4371e29"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.257557 4971 scope.go:117] "RemoveContainer" containerID="50b37896eea06628ecc0ff8113beb83f92dacd56dda44a82698dd4757d0484ec"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.275869 4971 scope.go:117] "RemoveContainer" containerID="e2c0fb322aacd49c39694152ff8bffb836dccf9b61150ee892504b3c52bfe072"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.307267 4971 scope.go:117] "RemoveContainer" containerID="d5455beca7782c67fb2e7459725302522a881919ad965b93e26e23b98b6a2900"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.330623 4971 scope.go:117] "RemoveContainer" containerID="395ee7a0aa47b6d164abe6bfa7fab3fefc7833a6a2a4b2b407ab04f1e5f34459"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.380408 4971 scope.go:117] "RemoveContainer" containerID="ff7d1cbddb197d85711318a82b94ec0d52e6adfecd4bd4e0dc18c99a41942d12"
Mar 09 09:45:04 crc kubenswrapper[4971]: I0309 09:45:04.396196 4971 scope.go:117] "RemoveContainer" containerID="d528d13ffd9863dfe83d2b0b1af5d6c819c2f47dc3a02586cf497de786326e75"
Mar 09 09:45:14 crc kubenswrapper[4971]: I0309 09:45:14.794990 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:45:14 crc kubenswrapper[4971]: I0309 09:45:14.795668 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:45:29 crc kubenswrapper[4971]: I0309 09:45:29.048804 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-kx6hb"]
Mar 09 09:45:29 crc kubenswrapper[4971]: I0309 09:45:29.055225 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/root-account-create-update-kx6hb"]
Mar 09 09:45:29 crc kubenswrapper[4971]: I0309 09:45:29.159537 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15888f6-dc1e-4f8d-8f06-3cf15b21ad21" path="/var/lib/kubelet/pods/f15888f6-dc1e-4f8d-8f06-3cf15b21ad21/volumes"
Mar 09 09:45:44 crc kubenswrapper[4971]: I0309 09:45:44.795066 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:45:44 crc kubenswrapper[4971]: I0309 09:45:44.795690 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.155728 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550826-4xn5v"]
Mar 09 09:46:00 crc kubenswrapper[4971]: E0309 09:46:00.156670 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" containerName="collect-profiles"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.156687 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" containerName="collect-profiles"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.156844 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c9e4c8-9f54-4122-bc7d-d58c864d7cd9" containerName="collect-profiles"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.157415 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-4xn5v"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.160113 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.160281 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.160463 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.165909 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-4xn5v"]
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.276092 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt\") pod \"auto-csr-approver-29550826-4xn5v\" (UID: \"e6dc741c-5773-4202-9293-aec350558517\") " pod="openshift-infra/auto-csr-approver-29550826-4xn5v"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.377207 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt\") pod \"auto-csr-approver-29550826-4xn5v\" (UID: \"e6dc741c-5773-4202-9293-aec350558517\") " pod="openshift-infra/auto-csr-approver-29550826-4xn5v"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.399565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt\") pod \"auto-csr-approver-29550826-4xn5v\" (UID: \"e6dc741c-5773-4202-9293-aec350558517\") " pod="openshift-infra/auto-csr-approver-29550826-4xn5v"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.474679 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-4xn5v"
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.892376 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-4xn5v"]
Mar 09 09:46:00 crc kubenswrapper[4971]: I0309 09:46:00.939394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-4xn5v" event={"ID":"e6dc741c-5773-4202-9293-aec350558517","Type":"ContainerStarted","Data":"d65d88ba98dcf69bdbfb451d9b6ac5e520a761943c9df93e1bd442943f34f995"}
Mar 09 09:46:02 crc kubenswrapper[4971]: I0309 09:46:02.961595 4971 generic.go:334] "Generic (PLEG): container finished" podID="e6dc741c-5773-4202-9293-aec350558517" containerID="81bdf10f7d99ca46218cb59d915f9c767940f5d8ba0338265356992f5b7860cf" exitCode=0
Mar 09 09:46:02 crc kubenswrapper[4971]: I0309 09:46:02.961715 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-4xn5v" event={"ID":"e6dc741c-5773-4202-9293-aec350558517","Type":"ContainerDied","Data":"81bdf10f7d99ca46218cb59d915f9c767940f5d8ba0338265356992f5b7860cf"}
Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.279135 4971 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-4xn5v" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.435402 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt\") pod \"e6dc741c-5773-4202-9293-aec350558517\" (UID: \"e6dc741c-5773-4202-9293-aec350558517\") " Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.440721 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt" (OuterVolumeSpecName: "kube-api-access-zrhpt") pod "e6dc741c-5773-4202-9293-aec350558517" (UID: "e6dc741c-5773-4202-9293-aec350558517"). InnerVolumeSpecName "kube-api-access-zrhpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.466236 4971 scope.go:117] "RemoveContainer" containerID="9703dcd74fa7b81e233bb1f220d60ca4230be8db77e5a78031bb8d2c9692f4f5" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.486204 4971 scope.go:117] "RemoveContainer" containerID="9d78bfe5d79bc1332f000be05a02ae3363e37f66af27305f24d9aaf4de48aeda" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.514393 4971 scope.go:117] "RemoveContainer" containerID="8514f9892d73207bca6439d58cf121d6234889b8e57b2db16cffc790d7b4ad49" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.537466 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrhpt\" (UniqueName: \"kubernetes.io/projected/e6dc741c-5773-4202-9293-aec350558517-kube-api-access-zrhpt\") on node \"crc\" DevicePath \"\"" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.544376 4971 scope.go:117] "RemoveContainer" containerID="d9efd0b7e90d1c5fc1f09a7e00f3865753ca39f93436949de4897d929b5dde96" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.560226 4971 scope.go:117] 
"RemoveContainer" containerID="685ebb342528743121758ca1e9c7e33a0df5f99a17f412802da9e5017c61621f" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.574536 4971 scope.go:117] "RemoveContainer" containerID="778a01a71c0604ad7e14750e5dd3ae66e36d42a9179af09e745e3aa305e5ad95" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.589190 4971 scope.go:117] "RemoveContainer" containerID="371c269962f08a5e5cb9d92b8dd0305621f0d0b732e821209bd9c77742716ce3" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.605167 4971 scope.go:117] "RemoveContainer" containerID="4458d3e84ca56cb30a624364166ae4fa8207e1a3939676005bb9a4bda0ad96cb" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.621768 4971 scope.go:117] "RemoveContainer" containerID="2f9499650f3c8ff7cee6d1b2c7ee361d719b0dfefa5b0bccaddb4f38a3681cda" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.636587 4971 scope.go:117] "RemoveContainer" containerID="010bd76869fd777a88efe67de904a37c0c4d058f63e845c195a9b3a4a07771fb" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.977134 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-4xn5v" event={"ID":"e6dc741c-5773-4202-9293-aec350558517","Type":"ContainerDied","Data":"d65d88ba98dcf69bdbfb451d9b6ac5e520a761943c9df93e1bd442943f34f995"} Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.977231 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-4xn5v" Mar 09 09:46:04 crc kubenswrapper[4971]: I0309 09:46:04.977218 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d65d88ba98dcf69bdbfb451d9b6ac5e520a761943c9df93e1bd442943f34f995" Mar 09 09:46:05 crc kubenswrapper[4971]: I0309 09:46:05.330743 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-rws2m"] Mar 09 09:46:05 crc kubenswrapper[4971]: I0309 09:46:05.337128 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-rws2m"] Mar 09 09:46:07 crc kubenswrapper[4971]: I0309 09:46:07.160865 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b79462f-fbdd-40be-9c60-adca2d053c26" path="/var/lib/kubelet/pods/0b79462f-fbdd-40be-9c60-adca2d053c26/volumes" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.893470 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"] Mar 09 09:46:13 crc kubenswrapper[4971]: E0309 09:46:13.894371 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6dc741c-5773-4202-9293-aec350558517" containerName="oc" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.894385 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6dc741c-5773-4202-9293-aec350558517" containerName="oc" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.894548 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6dc741c-5773-4202-9293-aec350558517" containerName="oc" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.895572 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.905039 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"] Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.972689 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.972881 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:13 crc kubenswrapper[4971]: I0309 09:46:13.973031 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ckz4\" (UniqueName: \"kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.074853 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.074921 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ckz4\" (UniqueName: \"kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.075014 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.075572 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.075696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.093728 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ckz4\" (UniqueName: \"kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4\") pod \"certified-operators-dnqv5\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") " pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.213696 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.543775 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"] Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.794775 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.794845 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.794894 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.795530 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:46:14 crc kubenswrapper[4971]: I0309 09:46:14.795595 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" 
containerID="cri-o://0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73" gracePeriod=600 Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.045607 4971 generic.go:334] "Generic (PLEG): container finished" podID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerID="1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762" exitCode=0 Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.046110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerDied","Data":"1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762"} Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.046156 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerStarted","Data":"0531599abe796803bd6c0581f56b5034296e9659c562c82af81b2232f91f778b"} Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.053173 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73" exitCode=0 Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.053232 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73"} Mar 09 09:46:15 crc kubenswrapper[4971]: I0309 09:46:15.053439 4971 scope.go:117] "RemoveContainer" containerID="cc375558fe6e32e81af0357f1b5962f3f3827247e841efa171f347f6cf29b99c" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.061903 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" 
event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerStarted","Data":"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698"} Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.065985 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"} Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.693180 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrs5s"] Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.695883 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.700987 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrs5s"] Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.815534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhp9\" (UniqueName: \"kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.815595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.815716 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.916900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhp9\" (UniqueName: \"kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.916956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.917013 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.917501 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.917511 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:16 crc kubenswrapper[4971]: I0309 09:46:16.935926 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhp9\" (UniqueName: \"kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9\") pod \"community-operators-wrs5s\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") " pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:17 crc kubenswrapper[4971]: I0309 09:46:17.031378 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:17 crc kubenswrapper[4971]: I0309 09:46:17.090104 4971 generic.go:334] "Generic (PLEG): container finished" podID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerID="19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698" exitCode=0 Mar 09 09:46:17 crc kubenswrapper[4971]: I0309 09:46:17.090462 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerDied","Data":"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698"} Mar 09 09:46:17 crc kubenswrapper[4971]: I0309 09:46:17.541719 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrs5s"] Mar 09 09:46:17 crc kubenswrapper[4971]: W0309 09:46:17.545518 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071ea90a_80b5_4327_afbf_ee65f0e899d2.slice/crio-d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a WatchSource:0}: Error finding container 
d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a: Status 404 returned error can't find the container with id d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a Mar 09 09:46:18 crc kubenswrapper[4971]: I0309 09:46:18.099434 4971 generic.go:334] "Generic (PLEG): container finished" podID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerID="1b91ce1a349e1767f2c4e4ac532a41fb583199ab20c0216590a4dabff0285f79" exitCode=0 Mar 09 09:46:18 crc kubenswrapper[4971]: I0309 09:46:18.099527 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerDied","Data":"1b91ce1a349e1767f2c4e4ac532a41fb583199ab20c0216590a4dabff0285f79"} Mar 09 09:46:18 crc kubenswrapper[4971]: I0309 09:46:18.100087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerStarted","Data":"d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a"} Mar 09 09:46:18 crc kubenswrapper[4971]: I0309 09:46:18.106621 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerStarted","Data":"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8"} Mar 09 09:46:18 crc kubenswrapper[4971]: I0309 09:46:18.142719 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dnqv5" podStartSLOduration=2.698590248 podStartE2EDuration="5.142698362s" podCreationTimestamp="2026-03-09 09:46:13 +0000 UTC" firstStartedPulling="2026-03-09 09:46:15.048586199 +0000 UTC m=+1578.608514009" lastFinishedPulling="2026-03-09 09:46:17.492694313 +0000 UTC m=+1581.052622123" observedRunningTime="2026-03-09 09:46:18.134438036 +0000 UTC m=+1581.694365846" 
watchObservedRunningTime="2026-03-09 09:46:18.142698362 +0000 UTC m=+1581.702626172" Mar 09 09:46:19 crc kubenswrapper[4971]: I0309 09:46:19.115707 4971 generic.go:334] "Generic (PLEG): container finished" podID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerID="fd8395bfcf18228ee414683e0617de3257daee4ebc444aba1ae494954145ae74" exitCode=0 Mar 09 09:46:19 crc kubenswrapper[4971]: I0309 09:46:19.115759 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerDied","Data":"fd8395bfcf18228ee414683e0617de3257daee4ebc444aba1ae494954145ae74"} Mar 09 09:46:20 crc kubenswrapper[4971]: I0309 09:46:20.156879 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerStarted","Data":"2efd9b419d55287a59dc97fae07b71cb7de6619d58e834fc359dd77099c36d6d"} Mar 09 09:46:20 crc kubenswrapper[4971]: I0309 09:46:20.207152 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrs5s" podStartSLOduration=2.7768868490000003 podStartE2EDuration="4.207132438s" podCreationTimestamp="2026-03-09 09:46:16 +0000 UTC" firstStartedPulling="2026-03-09 09:46:18.101485348 +0000 UTC m=+1581.661413168" lastFinishedPulling="2026-03-09 09:46:19.531730957 +0000 UTC m=+1583.091658757" observedRunningTime="2026-03-09 09:46:20.206668335 +0000 UTC m=+1583.766596145" watchObservedRunningTime="2026-03-09 09:46:20.207132438 +0000 UTC m=+1583.767060248" Mar 09 09:46:24 crc kubenswrapper[4971]: I0309 09:46:24.213919 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:24 crc kubenswrapper[4971]: I0309 09:46:24.214527 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:24 crc kubenswrapper[4971]: I0309 09:46:24.255662 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:25 crc kubenswrapper[4971]: I0309 09:46:25.235761 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:25 crc kubenswrapper[4971]: I0309 09:46:25.285643 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"] Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.032530 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.032594 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.082245 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.202951 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dnqv5" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="registry-server" containerID="cri-o://260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8" gracePeriod=2 Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.252134 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrs5s" Mar 09 09:46:27 crc kubenswrapper[4971]: I0309 09:46:27.892227 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrs5s"] Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.208725 4971 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.216194 4971 generic.go:334] "Generic (PLEG): container finished" podID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerID="260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8" exitCode=0 Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.216806 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dnqv5" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.216938 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerDied","Data":"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8"} Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.216966 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dnqv5" event={"ID":"2531398f-7581-46f5-b192-cb50c23e8c1b","Type":"ContainerDied","Data":"0531599abe796803bd6c0581f56b5034296e9659c562c82af81b2232f91f778b"} Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.216982 4971 scope.go:117] "RemoveContainer" containerID="260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.253280 4971 scope.go:117] "RemoveContainer" containerID="19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.280897 4971 scope.go:117] "RemoveContainer" containerID="1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.299680 4971 scope.go:117] "RemoveContainer" containerID="260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8" Mar 09 09:46:28 crc kubenswrapper[4971]: E0309 09:46:28.300060 4971 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8\": container with ID starting with 260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8 not found: ID does not exist" containerID="260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.300094 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8"} err="failed to get container status \"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8\": rpc error: code = NotFound desc = could not find container \"260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8\": container with ID starting with 260c4cc78dd9790b4676d548150f249b467494362c4cf2c6bcc48b3ea0e90af8 not found: ID does not exist" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.300116 4971 scope.go:117] "RemoveContainer" containerID="19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698" Mar 09 09:46:28 crc kubenswrapper[4971]: E0309 09:46:28.300503 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698\": container with ID starting with 19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698 not found: ID does not exist" containerID="19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698" Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.300532 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698"} err="failed to get container status \"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698\": rpc error: code = NotFound 
desc = could not find container \"19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698\": container with ID starting with 19e3ac63a1fcb4260651aa5a11af81ed8cab4eef1c7b0855cfa95404667cc698 not found: ID does not exist"
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.300548 4971 scope.go:117] "RemoveContainer" containerID="1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762"
Mar 09 09:46:28 crc kubenswrapper[4971]: E0309 09:46:28.300896 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762\": container with ID starting with 1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762 not found: ID does not exist" containerID="1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762"
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.300925 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762"} err="failed to get container status \"1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762\": rpc error: code = NotFound desc = could not find container \"1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762\": container with ID starting with 1355bbbe54bd4b4931d1ec93483e013375eff85df1c25c74ed7e31c6689b8762 not found: ID does not exist"
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.304576 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ckz4\" (UniqueName: \"kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4\") pod \"2531398f-7581-46f5-b192-cb50c23e8c1b\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") "
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.305440 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content\") pod \"2531398f-7581-46f5-b192-cb50c23e8c1b\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") "
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.305521 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities\") pod \"2531398f-7581-46f5-b192-cb50c23e8c1b\" (UID: \"2531398f-7581-46f5-b192-cb50c23e8c1b\") "
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.307500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities" (OuterVolumeSpecName: "utilities") pod "2531398f-7581-46f5-b192-cb50c23e8c1b" (UID: "2531398f-7581-46f5-b192-cb50c23e8c1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.310204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4" (OuterVolumeSpecName: "kube-api-access-5ckz4") pod "2531398f-7581-46f5-b192-cb50c23e8c1b" (UID: "2531398f-7581-46f5-b192-cb50c23e8c1b"). InnerVolumeSpecName "kube-api-access-5ckz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.365784 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2531398f-7581-46f5-b192-cb50c23e8c1b" (UID: "2531398f-7581-46f5-b192-cb50c23e8c1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.407502 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ckz4\" (UniqueName: \"kubernetes.io/projected/2531398f-7581-46f5-b192-cb50c23e8c1b-kube-api-access-5ckz4\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.407547 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.407559 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2531398f-7581-46f5-b192-cb50c23e8c1b-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.551586 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"]
Mar 09 09:46:28 crc kubenswrapper[4971]: I0309 09:46:28.557656 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dnqv5"]
Mar 09 09:46:29 crc kubenswrapper[4971]: I0309 09:46:29.161635 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" path="/var/lib/kubelet/pods/2531398f-7581-46f5-b192-cb50c23e8c1b/volumes"
Mar 09 09:46:29 crc kubenswrapper[4971]: I0309 09:46:29.227163 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrs5s" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="registry-server" containerID="cri-o://2efd9b419d55287a59dc97fae07b71cb7de6619d58e834fc359dd77099c36d6d" gracePeriod=2
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.237932 4971 generic.go:334] "Generic (PLEG): container finished" podID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerID="2efd9b419d55287a59dc97fae07b71cb7de6619d58e834fc359dd77099c36d6d" exitCode=0
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.237968 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerDied","Data":"2efd9b419d55287a59dc97fae07b71cb7de6619d58e834fc359dd77099c36d6d"}
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.238012 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrs5s" event={"ID":"071ea90a-80b5-4327-afbf-ee65f0e899d2","Type":"ContainerDied","Data":"d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a"}
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.238028 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d548e1c5169714d092b0e9df5bdd402dbfa022ca81a6f79f1ed20a54ea63d82a"
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.272217 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrs5s"
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.697753 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities\") pod \"071ea90a-80b5-4327-afbf-ee65f0e899d2\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") "
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.697933 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xhp9\" (UniqueName: \"kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9\") pod \"071ea90a-80b5-4327-afbf-ee65f0e899d2\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") "
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.697960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content\") pod \"071ea90a-80b5-4327-afbf-ee65f0e899d2\" (UID: \"071ea90a-80b5-4327-afbf-ee65f0e899d2\") "
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.707338 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities" (OuterVolumeSpecName: "utilities") pod "071ea90a-80b5-4327-afbf-ee65f0e899d2" (UID: "071ea90a-80b5-4327-afbf-ee65f0e899d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.725425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9" (OuterVolumeSpecName: "kube-api-access-6xhp9") pod "071ea90a-80b5-4327-afbf-ee65f0e899d2" (UID: "071ea90a-80b5-4327-afbf-ee65f0e899d2"). InnerVolumeSpecName "kube-api-access-6xhp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.772863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "071ea90a-80b5-4327-afbf-ee65f0e899d2" (UID: "071ea90a-80b5-4327-afbf-ee65f0e899d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.799502 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xhp9\" (UniqueName: \"kubernetes.io/projected/071ea90a-80b5-4327-afbf-ee65f0e899d2-kube-api-access-6xhp9\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.799566 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:30 crc kubenswrapper[4971]: I0309 09:46:30.799581 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/071ea90a-80b5-4327-afbf-ee65f0e899d2-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:46:31 crc kubenswrapper[4971]: I0309 09:46:31.385086 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrs5s"
Mar 09 09:46:31 crc kubenswrapper[4971]: I0309 09:46:31.408421 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrs5s"]
Mar 09 09:46:31 crc kubenswrapper[4971]: I0309 09:46:31.415658 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrs5s"]
Mar 09 09:46:33 crc kubenswrapper[4971]: I0309 09:46:33.161677 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" path="/var/lib/kubelet/pods/071ea90a-80b5-4327-afbf-ee65f0e899d2/volumes"
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.041927 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qt784"]
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.048850 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd"]
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.055189 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-1f04-account-create-update-4wtvd"]
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.061526 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qt784"]
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.159689 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1869c051-5ff7-4504-92c2-cbf07998153d" path="/var/lib/kubelet/pods/1869c051-5ff7-4504-92c2-cbf07998153d/volumes"
Mar 09 09:46:47 crc kubenswrapper[4971]: I0309 09:46:47.160206 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ccc8050-0beb-48dc-9422-04484a337b7e" path="/var/lib/kubelet/pods/7ccc8050-0beb-48dc-9422-04484a337b7e/volumes"
Mar 09 09:47:04 crc kubenswrapper[4971]: I0309 09:47:04.038234 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-n9hkd"]
Mar 09 09:47:04 crc kubenswrapper[4971]: I0309 09:47:04.044493 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-n9hkd"]
Mar 09 09:47:04 crc kubenswrapper[4971]: I0309 09:47:04.698409 4971 scope.go:117] "RemoveContainer" containerID="ba076d7843d817330fd2fde06e7d4350f89f50e00698db6cf8ed6476608f7a0d"
Mar 09 09:47:04 crc kubenswrapper[4971]: I0309 09:47:04.728560 4971 scope.go:117] "RemoveContainer" containerID="a5ded82dd6611ffcec77dcce3edfc9bf31e4bbedeb59631d196bba143fc77b63"
Mar 09 09:47:04 crc kubenswrapper[4971]: I0309 09:47:04.792281 4971 scope.go:117] "RemoveContainer" containerID="c34a56e58e19b4a4e4f7a4d883f80da11086d79f2d74adf9c01878e889d48f17"
Mar 09 09:47:05 crc kubenswrapper[4971]: I0309 09:47:05.162373 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61e8dbe-0b29-4be0-a931-1ef393790f86" path="/var/lib/kubelet/pods/a61e8dbe-0b29-4be0-a931-1ef393790f86/volumes"
Mar 09 09:47:10 crc kubenswrapper[4971]: I0309 09:47:10.027769 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-6snxg"]
Mar 09 09:47:10 crc kubenswrapper[4971]: I0309 09:47:10.032493 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-6snxg"]
Mar 09 09:47:11 crc kubenswrapper[4971]: I0309 09:47:11.170936 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd" path="/var/lib/kubelet/pods/e5e2cdc0-4bc1-4d0c-b132-f621ca3f00fd/volumes"
Mar 09 09:47:54 crc kubenswrapper[4971]: I0309 09:47:54.036807 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-db-create-szs62"]
Mar 09 09:47:54 crc kubenswrapper[4971]: I0309 09:47:54.042749 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"]
Mar 09 09:47:54 crc kubenswrapper[4971]: I0309 09:47:54.048302 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-1b72-account-create-update-z76fh"]
Mar 09 09:47:54 crc kubenswrapper[4971]: I0309 09:47:54.053803 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-db-create-szs62"]
Mar 09 09:47:55 crc kubenswrapper[4971]: I0309 09:47:55.165482 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3248073d-6eed-48b2-a088-b84c57ae3579" path="/var/lib/kubelet/pods/3248073d-6eed-48b2-a088-b84c57ae3579/volumes"
Mar 09 09:47:55 crc kubenswrapper[4971]: I0309 09:47:55.166670 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6609af45-62cb-4830-b4d1-39700af89b1b" path="/var/lib/kubelet/pods/6609af45-62cb-4830-b4d1-39700af89b1b/volumes"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.135118 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550828-8x5pt"]
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136096 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136114 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136148 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136157 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136170 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="extract-utilities"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136182 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="extract-utilities"
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136197 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="extract-content"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136204 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="extract-content"
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136213 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="extract-utilities"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136221 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="extract-utilities"
Mar 09 09:48:00 crc kubenswrapper[4971]: E0309 09:48:00.136237 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="extract-content"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136245 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="extract-content"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136441 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="071ea90a-80b5-4327-afbf-ee65f0e899d2" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.136468 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2531398f-7581-46f5-b192-cb50c23e8c1b" containerName="registry-server"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.137071 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.140635 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.140718 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.140825 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.144418 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-8x5pt"]
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.223245 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8j8f\" (UniqueName: \"kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f\") pod \"auto-csr-approver-29550828-8x5pt\" (UID: \"a6f21a0a-06fd-4f66-bef5-c17554e9aae7\") " pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.329118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8j8f\" (UniqueName: \"kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f\") pod \"auto-csr-approver-29550828-8x5pt\" (UID: \"a6f21a0a-06fd-4f66-bef5-c17554e9aae7\") " pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.347144 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8j8f\" (UniqueName: \"kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f\") pod \"auto-csr-approver-29550828-8x5pt\" (UID: \"a6f21a0a-06fd-4f66-bef5-c17554e9aae7\") " pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.454869 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:00 crc kubenswrapper[4971]: I0309 09:48:00.906970 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-8x5pt"]
Mar 09 09:48:01 crc kubenswrapper[4971]: I0309 09:48:01.030725 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-8x5pt" event={"ID":"a6f21a0a-06fd-4f66-bef5-c17554e9aae7","Type":"ContainerStarted","Data":"7ff36ba4befd6fab7bede016bf23aa1865ccd628641a73dc290f3e854fc5173a"}
Mar 09 09:48:04 crc kubenswrapper[4971]: I0309 09:48:04.875441 4971 scope.go:117] "RemoveContainer" containerID="6bd0613e16689845322a70359db1f3b709f820c2d340682b22ecca4e72b9e5b7"
Mar 09 09:48:04 crc kubenswrapper[4971]: I0309 09:48:04.897016 4971 scope.go:117] "RemoveContainer" containerID="f696cf4677bbf2f912323d0ee1352fb90272bdbdaf527ee6b58742236234a074"
Mar 09 09:48:04 crc kubenswrapper[4971]: I0309 09:48:04.929266 4971 scope.go:117] "RemoveContainer" containerID="6fca691ed1e7095eb057b9fa1f3238ac7ba426b229b8fed4dc336d76f3f79fc6"
Mar 09 09:48:04 crc kubenswrapper[4971]: I0309 09:48:04.968559 4971 scope.go:117] "RemoveContainer" containerID="44a5ac950525c0a616a0dfeea6f099440a2ce95a3f797d003275ffdb5cfa379a"
Mar 09 09:48:07 crc kubenswrapper[4971]: I0309 09:48:07.082082 4971 generic.go:334] "Generic (PLEG): container finished" podID="a6f21a0a-06fd-4f66-bef5-c17554e9aae7" containerID="a6ffb4ed080bb2d0151614bfb2935785b56947b5da5a3d8e21b253939225810b" exitCode=0
Mar 09 09:48:07 crc kubenswrapper[4971]: I0309 09:48:07.082565 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-8x5pt" event={"ID":"a6f21a0a-06fd-4f66-bef5-c17554e9aae7","Type":"ContainerDied","Data":"a6ffb4ed080bb2d0151614bfb2935785b56947b5da5a3d8e21b253939225810b"}
Mar 09 09:48:08 crc kubenswrapper[4971]: I0309 09:48:08.356187 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:08 crc kubenswrapper[4971]: I0309 09:48:08.445066 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8j8f\" (UniqueName: \"kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f\") pod \"a6f21a0a-06fd-4f66-bef5-c17554e9aae7\" (UID: \"a6f21a0a-06fd-4f66-bef5-c17554e9aae7\") "
Mar 09 09:48:08 crc kubenswrapper[4971]: I0309 09:48:08.450879 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f" (OuterVolumeSpecName: "kube-api-access-n8j8f") pod "a6f21a0a-06fd-4f66-bef5-c17554e9aae7" (UID: "a6f21a0a-06fd-4f66-bef5-c17554e9aae7"). InnerVolumeSpecName "kube-api-access-n8j8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:48:08 crc kubenswrapper[4971]: I0309 09:48:08.546560 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8j8f\" (UniqueName: \"kubernetes.io/projected/a6f21a0a-06fd-4f66-bef5-c17554e9aae7-kube-api-access-n8j8f\") on node \"crc\" DevicePath \"\""
Mar 09 09:48:09 crc kubenswrapper[4971]: I0309 09:48:09.096492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-8x5pt" event={"ID":"a6f21a0a-06fd-4f66-bef5-c17554e9aae7","Type":"ContainerDied","Data":"7ff36ba4befd6fab7bede016bf23aa1865ccd628641a73dc290f3e854fc5173a"}
Mar 09 09:48:09 crc kubenswrapper[4971]: I0309 09:48:09.096548 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-8x5pt"
Mar 09 09:48:09 crc kubenswrapper[4971]: I0309 09:48:09.096554 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ff36ba4befd6fab7bede016bf23aa1865ccd628641a73dc290f3e854fc5173a"
Mar 09 09:48:09 crc kubenswrapper[4971]: I0309 09:48:09.414504 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-cqmvz"]
Mar 09 09:48:09 crc kubenswrapper[4971]: I0309 09:48:09.420221 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-cqmvz"]
Mar 09 09:48:11 crc kubenswrapper[4971]: I0309 09:48:11.159564 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca618e3d-049e-4a5b-b460-13fb0c3ad5d2" path="/var/lib/kubelet/pods/ca618e3d-049e-4a5b-b460-13fb0c3ad5d2/volumes"
Mar 09 09:48:44 crc kubenswrapper[4971]: I0309 09:48:44.794785 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:48:44 crc kubenswrapper[4971]: I0309 09:48:44.795644 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:49:05 crc kubenswrapper[4971]: I0309 09:49:05.046529 4971 scope.go:117] "RemoveContainer" containerID="9a8ab46a496be8b8181a93a0b8ac165aad47948812fbaf5ea7e46650de7c9084"
Mar 09 09:49:14 crc kubenswrapper[4971]: I0309 09:49:14.794808 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:49:14 crc kubenswrapper[4971]: I0309 09:49:14.795318 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:49:44 crc kubenswrapper[4971]: I0309 09:49:44.794456 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:49:44 crc kubenswrapper[4971]: I0309 09:49:44.795039 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:49:44 crc kubenswrapper[4971]: I0309 09:49:44.795083 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 09:49:44 crc kubenswrapper[4971]: I0309 09:49:44.795541 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:49:44 crc kubenswrapper[4971]: I0309 09:49:44.795594 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" gracePeriod=600
Mar 09 09:49:44 crc kubenswrapper[4971]: E0309 09:49:44.916428 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:49:45 crc kubenswrapper[4971]: I0309 09:49:45.841310 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" exitCode=0
Mar 09 09:49:45 crc kubenswrapper[4971]: I0309 09:49:45.841404 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"}
Mar 09 09:49:45 crc kubenswrapper[4971]: I0309 09:49:45.841629 4971 scope.go:117] "RemoveContainer" containerID="0632ac83f355f18592d74efe661ec3d1b8f6614853f6a58652b0adb7bc649d73"
Mar 09 09:49:45 crc kubenswrapper[4971]: I0309 09:49:45.842262 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:49:45 crc kubenswrapper[4971]: E0309 09:49:45.842647 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:49:51 crc kubenswrapper[4971]: I0309 09:49:51.322623 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 09 09:49:57 crc kubenswrapper[4971]: I0309 09:49:57.160584 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:49:57 crc kubenswrapper[4971]: E0309 09:49:57.161394 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.144487 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550830-t6nld"]
Mar 09 09:50:00 crc kubenswrapper[4971]: E0309 09:50:00.145121 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f21a0a-06fd-4f66-bef5-c17554e9aae7" containerName="oc"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.145137 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f21a0a-06fd-4f66-bef5-c17554e9aae7" containerName="oc"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.145286 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f21a0a-06fd-4f66-bef5-c17554e9aae7" containerName="oc"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.145732 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.149161 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.149413 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.150638 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.160554 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-t6nld"]
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.310907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88\") pod \"auto-csr-approver-29550830-t6nld\" (UID: \"fd0bfbf3-81ff-4403-98da-d03b5baa13ae\") " pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.412655 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88\") pod \"auto-csr-approver-29550830-t6nld\" (UID: \"fd0bfbf3-81ff-4403-98da-d03b5baa13ae\") " pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.435209 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88\") pod \"auto-csr-approver-29550830-t6nld\" (UID: \"fd0bfbf3-81ff-4403-98da-d03b5baa13ae\") " pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:00 crc kubenswrapper[4971]: I0309 09:50:00.473980 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:01 crc kubenswrapper[4971]: I0309 09:50:01.010688 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-t6nld"]
Mar 09 09:50:01 crc kubenswrapper[4971]: I0309 09:50:01.023424 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:50:01 crc kubenswrapper[4971]: I0309 09:50:01.968221 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-t6nld" event={"ID":"fd0bfbf3-81ff-4403-98da-d03b5baa13ae","Type":"ContainerStarted","Data":"f118d61a9c88a4ca945140971b948a2e3a99353940986e29980a7cfcaf3d8fd8"}
Mar 09 09:50:02 crc kubenswrapper[4971]: I0309 09:50:02.977894 4971 generic.go:334] "Generic (PLEG): container finished" podID="fd0bfbf3-81ff-4403-98da-d03b5baa13ae" containerID="b94c023fd09c46b872894ea4b638d1206bd64cfd0dd05ff4d670a7f12624ed17" exitCode=0
Mar 09 09:50:02 crc kubenswrapper[4971]: I0309 09:50:02.978065 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-t6nld" event={"ID":"fd0bfbf3-81ff-4403-98da-d03b5baa13ae","Type":"ContainerDied","Data":"b94c023fd09c46b872894ea4b638d1206bd64cfd0dd05ff4d670a7f12624ed17"}
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.252310 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.276083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88\") pod \"fd0bfbf3-81ff-4403-98da-d03b5baa13ae\" (UID: \"fd0bfbf3-81ff-4403-98da-d03b5baa13ae\") "
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.281079 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88" (OuterVolumeSpecName: "kube-api-access-4mt88") pod "fd0bfbf3-81ff-4403-98da-d03b5baa13ae" (UID: "fd0bfbf3-81ff-4403-98da-d03b5baa13ae"). InnerVolumeSpecName "kube-api-access-4mt88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.377477 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/fd0bfbf3-81ff-4403-98da-d03b5baa13ae-kube-api-access-4mt88\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.993360 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-t6nld" event={"ID":"fd0bfbf3-81ff-4403-98da-d03b5baa13ae","Type":"ContainerDied","Data":"f118d61a9c88a4ca945140971b948a2e3a99353940986e29980a7cfcaf3d8fd8"}
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.993399 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f118d61a9c88a4ca945140971b948a2e3a99353940986e29980a7cfcaf3d8fd8"
Mar 09 09:50:04 crc kubenswrapper[4971]: I0309 09:50:04.993455 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-t6nld"
Mar 09 09:50:05 crc kubenswrapper[4971]: I0309 09:50:05.308892 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-9v5wh"]
Mar 09 09:50:05 crc kubenswrapper[4971]: I0309 09:50:05.315338 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-9v5wh"]
Mar 09 09:50:07 crc kubenswrapper[4971]: I0309 09:50:07.173278 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65488d0c-4805-4f55-ac7c-8c838a321dd8" path="/var/lib/kubelet/pods/65488d0c-4805-4f55-ac7c-8c838a321dd8/volumes"
Mar 09 09:50:12 crc kubenswrapper[4971]: I0309 09:50:12.152277 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:50:12 crc kubenswrapper[4971]: E0309 09:50:12.153150 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:50:24 crc kubenswrapper[4971]: I0309 09:50:24.152666 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:50:24 crc kubenswrapper[4971]: E0309 09:50:24.153405 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:50:37 crc kubenswrapper[4971]: I0309 09:50:37.155535 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:50:37 crc kubenswrapper[4971]: E0309 09:50:37.156308 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.624646 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw"] Mar 09 09:50:40 crc kubenswrapper[4971]: E0309 09:50:40.625366 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0bfbf3-81ff-4403-98da-d03b5baa13ae" containerName="oc" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.625386 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0bfbf3-81ff-4403-98da-d03b5baa13ae" containerName="oc" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.625557 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0bfbf3-81ff-4403-98da-d03b5baa13ae" containerName="oc" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.626108 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.628556 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.629312 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.642153 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw"] Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813234 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813314 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813422 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813585 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813641 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c695z\" (UniqueName: \"kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.813712 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.914954 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915009 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c695z\" (UniqueName: \"kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" 
Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915044 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915145 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915175 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.915822 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc 
kubenswrapper[4971]: I0309 09:50:40.916122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.916128 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.921242 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.924798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: I0309 09:50:40.933856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c695z\" (UniqueName: \"kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z\") pod \"swift-ring-rebalance-debug-lb8sw\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:40 crc kubenswrapper[4971]: 
I0309 09:50:40.943266 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.165069 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw"] Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.240050 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" event={"ID":"1a0a4d5f-08d9-4793-be76-67380d3fdc9e","Type":"ContainerStarted","Data":"1d73d13e3739895a5d384ca72223fc3695999eebc9b4abacced4670e5f336e2a"} Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.785198 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.790142 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.791988 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.797478 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.808565 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.824789 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.933717 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-lock\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.933769 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-etc-swift\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.933808 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-etc-swift\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934089 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-cache\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934113 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-lock\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934133 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsnb\" (UniqueName: \"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-kube-api-access-8jsnb\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934195 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mqf\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-kube-api-access-c4mqf\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934416 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:41 crc kubenswrapper[4971]: I0309 09:50:41.934458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" 
(UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-cache\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035394 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mqf\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-kube-api-access-c4mqf\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035507 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035530 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-cache\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035546 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-lock\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035563 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-etc-swift\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 
09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035586 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035608 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-etc-swift\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035627 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-cache\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035648 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-lock\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.035663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsnb\" (UniqueName: \"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-kube-api-access-8jsnb\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.036241 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-cache\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.036337 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ae2371a4-446c-4c46-844e-0132f54ca498-lock\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.036443 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-lock\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.036630 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/604e95e7-5b66-4837-ae0a-2b08c59fac4b-cache\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.036818 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") device mount path \"/mnt/openstack/pv11\"" pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.037029 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") device mount path \"/mnt/openstack/pv10\"" pod="swift-kuttl-tests/swift-storage-1" Mar 09 
09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.047337 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-etc-swift\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.048948 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-etc-swift\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.054012 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mqf\" (UniqueName: \"kubernetes.io/projected/ae2371a4-446c-4c46-844e-0132f54ca498-kube-api-access-c4mqf\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.057698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.059773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-2\" (UID: \"ae2371a4-446c-4c46-844e-0132f54ca498\") " pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.068513 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jsnb\" (UniqueName: 
\"kubernetes.io/projected/604e95e7-5b66-4837-ae0a-2b08c59fac4b-kube-api-access-8jsnb\") pod \"swift-storage-1\" (UID: \"604e95e7-5b66-4837-ae0a-2b08c59fac4b\") " pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.108133 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.116817 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.252300 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" event={"ID":"1a0a4d5f-08d9-4793-be76-67380d3fdc9e","Type":"ContainerStarted","Data":"cc6b8c6d16fdcd26b230328427571d004b086a3b69f56cebfabf68371b62b69c"} Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.276173 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" podStartSLOduration=2.2761514 podStartE2EDuration="2.2761514s" podCreationTimestamp="2026-03-09 09:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:42.267957743 +0000 UTC m=+1845.827885573" watchObservedRunningTime="2026-03-09 09:50:42.2761514 +0000 UTC m=+1845.836079220" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.514371 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-4gkzk"] Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.515561 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.527281 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-4gkzk"] Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.603850 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.644340 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmmnl\" (UniqueName: \"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-kube-api-access-cmmnl\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.644436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-config-data\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.644501 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-log-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.644529 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-etc-swift\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: 
\"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.644578 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-run-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.671799 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-config-data\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746256 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-log-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746290 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-etc-swift\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746362 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-run-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746416 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmmnl\" (UniqueName: \"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-kube-api-access-cmmnl\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746917 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-run-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.746917 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-log-httpd\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.752982 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-config-data\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.754550 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-etc-swift\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.775001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmmnl\" (UniqueName: \"kubernetes.io/projected/c0f6b660-a1e1-4d7d-bff5-3b2cc666bada-kube-api-access-cmmnl\") pod \"swift-proxy-76c998454c-4gkzk\" (UID: \"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:42 crc kubenswrapper[4971]: I0309 09:50:42.834747 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.388597 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"1c20572f1fa053ed747bd81f7a38cf50c24eb78e8c346bdf1e9fb47fb422d7dd"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.388916 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"f3705ae8476902d80041d583513756920dff6c6a22324f6ddf99f6a388ae200f"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.388928 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"7804b1583900dc1c6ed2e05516db6ae0bb19652c1d5a98e3cca95daaba2dd1db"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.388936 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"787db45098bf30a96ec56176d771e184af9ff30f15caddb3c1b333506caa8297"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.420719 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-4gkzk"] Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.422021 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"155f51a64ec3c8ea5bdd177bd93d13590536b03e6f67b2f3c41092e5d0e0e1db"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.422165 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"c0b99f35256119b21c1881c92884703bb28d276ea2cb6b5547ab2a7231833f8e"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.422227 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"9499e54c74738c6485c459756663991e6f4685c8918adaa6359aae04b59adf09"} Mar 09 09:50:43 crc kubenswrapper[4971]: I0309 09:50:43.422307 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"a0d396af55864d6f2a209b876c9301ed0cf48499febf2847f4ebc358090c352c"} Mar 09 09:50:43 crc kubenswrapper[4971]: W0309 09:50:43.478305 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f6b660_a1e1_4d7d_bff5_3b2cc666bada.slice/crio-a66d19054a5f203b8f87eccba7fecb926f0b71bb724910a25b6783fa4481aa20 WatchSource:0}: Error finding container a66d19054a5f203b8f87eccba7fecb926f0b71bb724910a25b6783fa4481aa20: Status 404 
returned error can't find the container with id a66d19054a5f203b8f87eccba7fecb926f0b71bb724910a25b6783fa4481aa20 Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.430301 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" event={"ID":"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada","Type":"ContainerStarted","Data":"0c509c5f8e6f6b8f3ff73444d9161a858782c29922666e5be4cb6f08cf7264df"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.431463 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.431532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" event={"ID":"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada","Type":"ContainerStarted","Data":"1bd21db5f8bc78baa9283a9079983c7e7b00081924a8e3b91c294c7d19b7faf5"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.431599 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" event={"ID":"c0f6b660-a1e1-4d7d-bff5-3b2cc666bada","Type":"ContainerStarted","Data":"a66d19054a5f203b8f87eccba7fecb926f0b71bb724910a25b6783fa4481aa20"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.431735 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.434607 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"712e5690c8b867af1fd39ccff567af5d2b13c1be180237b012c9b538b6fd579f"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.434649 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"cdfccdb3c26a2c9a59706ddd275b72096d33e7e0b1af362b969a78b5176f76bb"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.434659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"e8499eb1ce171d2c14cc8007a3a18911c043b33ac582df546e303a46f951c1ce"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.434667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"1f672d4a7adf35d57c751a43df1b5dd1b46a4f456afce719ab93a2619670e3fe"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.439166 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"a961818ff223c935f2eaa26bb180f5b315717ff964fa407901e35ae99b68b7e5"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.439204 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"af1f48bef7d5978878be7f401303ee38795132e0ba09eb5441f26ebb352d819e"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.439214 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"cdfec57990f5aebfdf40ef73ffefc4adbb1aa99bc0e0d290e05d70b6b8b12ba4"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.439222 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"0b6fef5c675622fb69eaf6afaf955443c387309b84867e4d5cdca1d4ebb9582f"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.440983 4971 generic.go:334] "Generic (PLEG): container finished" podID="1a0a4d5f-08d9-4793-be76-67380d3fdc9e" containerID="cc6b8c6d16fdcd26b230328427571d004b086a3b69f56cebfabf68371b62b69c" exitCode=0 Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.441087 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" event={"ID":"1a0a4d5f-08d9-4793-be76-67380d3fdc9e","Type":"ContainerDied","Data":"cc6b8c6d16fdcd26b230328427571d004b086a3b69f56cebfabf68371b62b69c"} Mar 09 09:50:44 crc kubenswrapper[4971]: I0309 09:50:44.461881 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk" podStartSLOduration=2.461860836 podStartE2EDuration="2.461860836s" podCreationTimestamp="2026-03-09 09:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:44.457753547 +0000 UTC m=+1848.017681377" watchObservedRunningTime="2026-03-09 09:50:44.461860836 +0000 UTC m=+1848.021788656" Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.531373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"a04ff70c6f326f420773508675e93d09b40d7ea6dd26b4cd39fa49ee7ee94af1"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.531738 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"9119083e95b9225a7017f65d31382e0ef0588aa5e02fbb4ed03a4e07c4b12c2d"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.531751 4971 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"3dc1607fda8ebcebfa5ed1d334a09719a647cc23497bc113ccdfb93ece1848d4"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.703239 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-55fsl"] Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.767391 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"9f3f56db7a6e9d475fbff3a445a1c7d2cf799f5533ace159b44f3b6f74e7f8a6"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.767443 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"dac1a0bc2136b131380184c97c6077ef57e66aad2ed9555c1cb184cd93a02e05"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.767456 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"8d3cb8b6f70b3126f4411b800e65ab19568030c15a4f3ed99465ab240580cd81"} Mar 09 09:50:45 crc kubenswrapper[4971]: I0309 09:50:45.771401 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-55fsl"] Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:45.863424 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"] Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:45.865101 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025190 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025275 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025326 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025359 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf\") pod 
\"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.025407 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9tb\" (UniqueName: \"kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.233286 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"] Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234498 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234544 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234559 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234586 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234605 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.234647 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9tb\" (UniqueName: \"kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.244163 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.245551 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.245850 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.271067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.277666 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.292962 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9tb\" (UniqueName: \"kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb\") pod \"swift-ring-rebalance-2bspt\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.560997 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.676702 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.730425 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw"] Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.736292 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw"] Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.818779 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"6bf1145ecd0c4f7e113aa32d205f4ef118ef88779ebdb60bc043db86c304c09d"} Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.819285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"8d0404af3ff28a7a246a4b7b514a806aac1b504ed6887ce336e9058b5f016fc7"} Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.819440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"88e7d649462b77a7c13eda687e3811e27e8e82acbe4b250da689b0a457903045"} Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.837954 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"77e26a0504b759df535002c20af31c283cf96c43ec3efb05f23cd92bba92e71f"} Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.837992 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"b205beaad5a643dac0419169617a28833c44453d6d5faba6548e02a7a8b5acb5"} Mar 09 09:50:46 
crc kubenswrapper[4971]: I0309 09:50:46.838002 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"818ecf2c0688adaf99be102c145df5027849bf01361cef2f32b0e6172df5c8c1"} Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.842833 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lb8sw" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.842887 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d73d13e3739895a5d384ca72223fc3695999eebc9b4abacced4670e5f336e2a" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856092 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c695z\" (UniqueName: \"kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856168 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856210 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.856230 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts\") pod \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\" (UID: \"1a0a4d5f-08d9-4793-be76-67380d3fdc9e\") " Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.857563 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.858480 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.865384 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z" (OuterVolumeSpecName: "kube-api-access-c695z") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "kube-api-access-c695z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.893792 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.911364 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"] Mar 09 09:50:46 crc kubenswrapper[4971]: E0309 09:50:46.911792 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a4d5f-08d9-4793-be76-67380d3fdc9e" containerName="swift-ring-rebalance" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.911816 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a4d5f-08d9-4793-be76-67380d3fdc9e" containerName="swift-ring-rebalance" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.912052 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a4d5f-08d9-4793-be76-67380d3fdc9e" containerName="swift-ring-rebalance" Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.912765 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.917808 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts" (OuterVolumeSpecName: "scripts") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.929520 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1a0a4d5f-08d9-4793-be76-67380d3fdc9e" (UID: "1a0a4d5f-08d9-4793-be76-67380d3fdc9e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.942203 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"]
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.959798 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.968767 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.968819 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.968833 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.968847 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.968861 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c695z\" (UniqueName: \"kubernetes.io/projected/1a0a4d5f-08d9-4793-be76-67380d3fdc9e-kube-api-access-c695z\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:46 crc kubenswrapper[4971]: I0309 09:50:46.974454 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"]
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070058 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2xww\" (UniqueName: \"kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070421 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070546 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070605 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.070633 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.168995 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0a4d5f-08d9-4793-be76-67380d3fdc9e" path="/var/lib/kubelet/pods/1a0a4d5f-08d9-4793-be76-67380d3fdc9e/volumes"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.170310 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90b7a65-9704-4b25-9ad9-56ed4bb14886" path="/var/lib/kubelet/pods/b90b7a65-9704-4b25-9ad9-56ed4bb14886/volumes"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.171735 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.171890 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.172003 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.172099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.172227 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2xww\" (UniqueName: \"kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.172332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.172969 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.173201 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.173698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.176374 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.178205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.193968 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2xww\" (UniqueName: \"kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww\") pod \"swift-ring-rebalance-debug-fs2pm\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.306219 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.600515 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"]
Mar 09 09:50:47 crc kubenswrapper[4971]: W0309 09:50:47.604695 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17af7be5_5a55_4a79_b05f_098df11f2550.slice/crio-f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e WatchSource:0}: Error finding container f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e: Status 404 returned error can't find the container with id f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.852718 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" event={"ID":"17af7be5-5a55-4a79-b05f-098df11f2550","Type":"ContainerStarted","Data":"2d9637d355dbf9e3eb1b55f43bcdece9da435ade7e950c9dcf4dfe0c05c04d65"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.853150 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" event={"ID":"17af7be5-5a55-4a79-b05f-098df11f2550","Type":"ContainerStarted","Data":"f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.860926 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"28ef06d3b19b0278f92bb58a9f993b613e2c80a6b4b28ee6a30f2c6456d494dc"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.860977 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"ae2371a4-446c-4c46-844e-0132f54ca498","Type":"ContainerStarted","Data":"d98a94ba5c77223c29bd7dc3daa6cd81014aaf3443c95e44d394485589adc408"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.874910 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" podStartSLOduration=1.874888814 podStartE2EDuration="1.874888814s" podCreationTimestamp="2026-03-09 09:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:47.871572149 +0000 UTC m=+1851.431499979" watchObservedRunningTime="2026-03-09 09:50:47.874888814 +0000 UTC m=+1851.434816624"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.896894 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"1d972ab5e713cd1f2a820cb665dfbe634b1a56855b2c80d1ddd90e2c1beea94f"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.897146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"604e95e7-5b66-4837-ae0a-2b08c59fac4b","Type":"ContainerStarted","Data":"949021e14329e3e36497bce2908805f971293c7ca1d0306cc1b70a1bb3934733"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.903766 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" event={"ID":"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c","Type":"ContainerStarted","Data":"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.903821 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" event={"ID":"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c","Type":"ContainerStarted","Data":"27792c9e714436c9a24a428407be08c0f86dd8391514d44ce11c2eb69a062ffd"}
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.920393 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=7.9203725259999995 podStartE2EDuration="7.920372526s" podCreationTimestamp="2026-03-09 09:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:47.916593907 +0000 UTC m=+1851.476521737" watchObservedRunningTime="2026-03-09 09:50:47.920372526 +0000 UTC m=+1851.480300336"
Mar 09 09:50:47 crc kubenswrapper[4971]: I0309 09:50:47.943726 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" podStartSLOduration=2.943704099 podStartE2EDuration="2.943704099s" podCreationTimestamp="2026-03-09 09:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:47.937397377 +0000 UTC m=+1851.497325207" watchObservedRunningTime="2026-03-09 09:50:47.943704099 +0000 UTC m=+1851.503631909"
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.013026 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"]
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.015457 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=8.015432997 podStartE2EDuration="8.015432997s" podCreationTimestamp="2026-03-09 09:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:50:48.00030194 +0000 UTC m=+1851.560229760" watchObservedRunningTime="2026-03-09 09:50:48.015432997 +0000 UTC m=+1851.575360807"
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.039572 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.042657 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-server" containerID="cri-o://70a4736032a91173a8081a9d98939447d2f1ecece350b377d82dde74455e3069" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043101 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-updater" containerID="cri-o://e990ced9d0c8867ebf7e37598faa5157c49269703059f5157a1ed4694f8990a2" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043152 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-reaper" containerID="cri-o://b6a6726c2672f3f891932c39730d7bf24ebbbaf74b224a8e384a86cc59e3866f" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043224 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-auditor" containerID="cri-o://10310cdbcf68045954b30655355f31ab5609c827bd113ceb6f36c7acdb67568b" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043204 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-replicator" containerID="cri-o://6cac6b8f712e2749ddb02ae9fa9c26cde95ba42afc84e1c1ab4eb15dda4699ae" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043255 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-replicator" containerID="cri-o://33d9df249f56981ed105c7cd6de3c253f19a8190ec6977cac54400d478cb03e7" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043356 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-auditor" containerID="cri-o://439626a0c50fc4086e7623b9d44cec2d3c1789da2fdcfd975bd6b2c21ff67bde" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043393 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-server" containerID="cri-o://384d552cdcfa77a481f9fa7d9339755f30189066b1c0db676dd5af57ff9f3d86" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043447 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-expirer" containerID="cri-o://8173809deda0ceecec61a266c95370ae3e2c5629f0fc60a80c3c4c47b3894635" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043485 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="rsync" containerID="cri-o://977e8188d1d5d046c5f1d39c7dac78e43224aaeb87daa3d4bee54aee924a5664" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043495 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-sharder" containerID="cri-o://896548583b712817fd9ac2457aa52b694754b489cd6ec4d51bcf7a2f46d8c65d" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043564 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-auditor" containerID="cri-o://d7adf37c0f5ab81fc50a28e521f676d1489460683f4b8dfc2e36052a19f9d07e" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043585 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-updater" containerID="cri-o://0862fe7dba311c6c6b693d2c332d0b86b3f53acee8d036abe908aff5cb6d6e8f" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043514 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="swift-recon-cron" containerID="cri-o://bd08ba1a76889c53f7f72fa52eabc950709d56d78c23c1a9da6fd3dbfe751148" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043566 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-replicator" containerID="cri-o://e0fd9fa4462f1853a43e7cba4a34485ff301c975cb23eef544f0c47106ccc0de" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.043106 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-server" containerID="cri-o://24779d2ef2862ffd4c9ec9f48e564d2007ab125e37e3d3b4e67d1ea10b04135f" gracePeriod=30
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.916221 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="896548583b712817fd9ac2457aa52b694754b489cd6ec4d51bcf7a2f46d8c65d" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.917788 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="977e8188d1d5d046c5f1d39c7dac78e43224aaeb87daa3d4bee54aee924a5664" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.917895 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="8173809deda0ceecec61a266c95370ae3e2c5629f0fc60a80c3c4c47b3894635" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.917977 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="0862fe7dba311c6c6b693d2c332d0b86b3f53acee8d036abe908aff5cb6d6e8f" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918051 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="d7adf37c0f5ab81fc50a28e521f676d1489460683f4b8dfc2e36052a19f9d07e" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918121 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="e0fd9fa4462f1853a43e7cba4a34485ff301c975cb23eef544f0c47106ccc0de" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918228 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="24779d2ef2862ffd4c9ec9f48e564d2007ab125e37e3d3b4e67d1ea10b04135f" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918328 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="e990ced9d0c8867ebf7e37598faa5157c49269703059f5157a1ed4694f8990a2" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918430 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="439626a0c50fc4086e7623b9d44cec2d3c1789da2fdcfd975bd6b2c21ff67bde" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918512 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="6cac6b8f712e2749ddb02ae9fa9c26cde95ba42afc84e1c1ab4eb15dda4699ae" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918570 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="384d552cdcfa77a481f9fa7d9339755f30189066b1c0db676dd5af57ff9f3d86" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918651 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="b6a6726c2672f3f891932c39730d7bf24ebbbaf74b224a8e384a86cc59e3866f" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918730 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="10310cdbcf68045954b30655355f31ab5609c827bd113ceb6f36c7acdb67568b" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918834 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="33d9df249f56981ed105c7cd6de3c253f19a8190ec6977cac54400d478cb03e7" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.918940 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="70a4736032a91173a8081a9d98939447d2f1ecece350b377d82dde74455e3069" exitCode=0
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.916412 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"896548583b712817fd9ac2457aa52b694754b489cd6ec4d51bcf7a2f46d8c65d"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919209 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"977e8188d1d5d046c5f1d39c7dac78e43224aaeb87daa3d4bee54aee924a5664"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919283 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"8173809deda0ceecec61a266c95370ae3e2c5629f0fc60a80c3c4c47b3894635"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"0862fe7dba311c6c6b693d2c332d0b86b3f53acee8d036abe908aff5cb6d6e8f"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919435 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"d7adf37c0f5ab81fc50a28e521f676d1489460683f4b8dfc2e36052a19f9d07e"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919497 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"e0fd9fa4462f1853a43e7cba4a34485ff301c975cb23eef544f0c47106ccc0de"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919558 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"24779d2ef2862ffd4c9ec9f48e564d2007ab125e37e3d3b4e67d1ea10b04135f"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919622 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"e990ced9d0c8867ebf7e37598faa5157c49269703059f5157a1ed4694f8990a2"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919681 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"439626a0c50fc4086e7623b9d44cec2d3c1789da2fdcfd975bd6b2c21ff67bde"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"6cac6b8f712e2749ddb02ae9fa9c26cde95ba42afc84e1c1ab4eb15dda4699ae"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919818 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"384d552cdcfa77a481f9fa7d9339755f30189066b1c0db676dd5af57ff9f3d86"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919906 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"b6a6726c2672f3f891932c39730d7bf24ebbbaf74b224a8e384a86cc59e3866f"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.919996 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"10310cdbcf68045954b30655355f31ab5609c827bd113ceb6f36c7acdb67568b"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.920033 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"33d9df249f56981ed105c7cd6de3c253f19a8190ec6977cac54400d478cb03e7"}
Mar 09 09:50:48 crc kubenswrapper[4971]: I0309 09:50:48.920045 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"70a4736032a91173a8081a9d98939447d2f1ecece350b377d82dde74455e3069"}
Mar 09 09:50:49 crc kubenswrapper[4971]: I0309 09:50:49.152054 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:50:49 crc kubenswrapper[4971]: E0309 09:50:49.152482 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:50:49 crc kubenswrapper[4971]: I0309 09:50:49.925044 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" podUID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" containerName="swift-ring-rebalance" containerID="cri-o://2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5" gracePeriod=30
Mar 09 09:50:52 crc kubenswrapper[4971]: I0309 09:50:52.839533 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk"
Mar 09 09:50:52 crc kubenswrapper[4971]: I0309 09:50:52.841070 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-4gkzk"
Mar 09 09:50:52 crc kubenswrapper[4971]: I0309 09:50:52.937522 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"]
Mar 09 09:50:52 crc kubenswrapper[4971]: I0309 09:50:52.938522 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-httpd" containerID="cri-o://f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" gracePeriod=30
Mar 09 09:50:52 crc kubenswrapper[4971]: I0309 09:50:52.938531 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-server" containerID="cri-o://448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" gracePeriod=30
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.522987 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"]
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.527975 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.539905 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"]
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.546851 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.674263 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd\") pod \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") "
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.674722 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" (UID: "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.675152 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd\") pod \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") "
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.675312 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift\") pod \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") "
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.675461 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" (UID: "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.675549 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftfz5\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5\") pod \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") "
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.675703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data\") pod \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\" (UID: \"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4\") "
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.676005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mmd\" (UniqueName: \"kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.676118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.676205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.676359 4971 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.676435 4971 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.682638 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" (UID: "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.688517 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5" (OuterVolumeSpecName: "kube-api-access-ftfz5") pod "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" (UID: "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4"). InnerVolumeSpecName "kube-api-access-ftfz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.717380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data" (OuterVolumeSpecName: "config-data") pod "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" (UID: "9eafe0b1-303d-4cc5-af18-c9a3d72b38b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.777486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mmd\" (UniqueName: \"kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.777806 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.777900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd"
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.778083 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.778154 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.778222 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftfz5\" (UniqueName:
\"kubernetes.io/projected/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4-kube-api-access-ftfz5\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.778434 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.778691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.799056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mmd\" (UniqueName: \"kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd\") pod \"redhat-marketplace-4hqzd\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.865443 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.965767 4971 generic.go:334] "Generic (PLEG): container finished" podID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerID="448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" exitCode=0 Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.967508 4971 generic.go:334] "Generic (PLEG): container finished" podID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerID="f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" exitCode=0 Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.965864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerDied","Data":"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c"} Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.967553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerDied","Data":"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2"} Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.967568 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" event={"ID":"9eafe0b1-303d-4cc5-af18-c9a3d72b38b4","Type":"ContainerDied","Data":"0e2689e9b77c35b20af58de129e3a1646e5cc3e031477dab4cc4252b9d329442"} Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.967589 4971 scope.go:117] "RemoveContainer" containerID="448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" Mar 09 09:50:53 crc kubenswrapper[4971]: I0309 09:50:53.965837 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.007634 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"] Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.013274 4971 scope.go:117] "RemoveContainer" containerID="f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.014257 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-6fcb54769f-hp2hb"] Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.039645 4971 scope.go:117] "RemoveContainer" containerID="448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" Mar 09 09:50:54 crc kubenswrapper[4971]: E0309 09:50:54.040170 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c\": container with ID starting with 448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c not found: ID does not exist" containerID="448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040208 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c"} err="failed to get container status \"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c\": rpc error: code = NotFound desc = could not find container \"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c\": container with ID starting with 448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c not found: ID does not exist" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040231 4971 scope.go:117] "RemoveContainer" 
containerID="f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" Mar 09 09:50:54 crc kubenswrapper[4971]: E0309 09:50:54.040685 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2\": container with ID starting with f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2 not found: ID does not exist" containerID="f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040708 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2"} err="failed to get container status \"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2\": rpc error: code = NotFound desc = could not find container \"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2\": container with ID starting with f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2 not found: ID does not exist" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040722 4971 scope.go:117] "RemoveContainer" containerID="448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040927 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c"} err="failed to get container status \"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c\": rpc error: code = NotFound desc = could not find container \"448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c\": container with ID starting with 448453c8bf3527aa1dc5871a57c937e5c9757f4fe793744201997b2ed232154c not found: ID does not exist" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.040952 4971 scope.go:117] 
"RemoveContainer" containerID="f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.041137 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2"} err="failed to get container status \"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2\": rpc error: code = NotFound desc = could not find container \"f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2\": container with ID starting with f6992a76acc8f35c94812030cb8c28fdce50d0cc6d424c959284f760cf2f0eb2 not found: ID does not exist" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.365088 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"] Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.913456 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:50:54 crc kubenswrapper[4971]: E0309 09:50:54.915728 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-server" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.915747 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-server" Mar 09 09:50:54 crc kubenswrapper[4971]: E0309 09:50:54.915765 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-httpd" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.915772 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-httpd" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.915961 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" 
containerName="proxy-server" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.915974 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" containerName="proxy-httpd" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.919983 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.935041 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.977569 4971 generic.go:334] "Generic (PLEG): container finished" podID="59f43291-066e-4226-9f06-c73014d8f899" containerID="69a5391bf4d4adea91b50225f87dec46d20bf0786abbe504a31e8ab7faf97ed1" exitCode=0 Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.977636 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerDied","Data":"69a5391bf4d4adea91b50225f87dec46d20bf0786abbe504a31e8ab7faf97ed1"} Mar 09 09:50:54 crc kubenswrapper[4971]: I0309 09:50:54.977913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerStarted","Data":"c95d79f5ebe33e6e57da8c0eab45943db60698c435715de0354a24d2cb39f806"} Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.002350 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.002473 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.002500 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr5k4\" (UniqueName: \"kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.103706 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.103762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr5k4\" (UniqueName: \"kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.103816 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.104370 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.104591 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.125732 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr5k4\" (UniqueName: \"kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4\") pod \"redhat-operators-wm52d\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.177221 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eafe0b1-303d-4cc5-af18-c9a3d72b38b4" path="/var/lib/kubelet/pods/9eafe0b1-303d-4cc5-af18-c9a3d72b38b4/volumes" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.238594 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.687934 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:50:55 crc kubenswrapper[4971]: W0309 09:50:55.697469 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eebc3a1_432b_4f2c_a94c_c7abf5b3bda9.slice/crio-c2b711da18b32c6b2a0789f3a5ec2c995c6fc771fe49883e4006375a377a577c WatchSource:0}: Error finding container c2b711da18b32c6b2a0789f3a5ec2c995c6fc771fe49883e4006375a377a577c: Status 404 returned error can't find the container with id c2b711da18b32c6b2a0789f3a5ec2c995c6fc771fe49883e4006375a377a577c Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.990361 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerStarted","Data":"89160f568ace9e2c4d450b9b914c96263c4ccfd36673522d89d8d8619a41a24d"} Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.992034 4971 generic.go:334] "Generic (PLEG): container finished" podID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerID="87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d" exitCode=0 Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.992084 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerDied","Data":"87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d"} Mar 09 09:50:55 crc kubenswrapper[4971]: I0309 09:50:55.992141 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" 
event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerStarted","Data":"c2b711da18b32c6b2a0789f3a5ec2c995c6fc771fe49883e4006375a377a577c"} Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.441701 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521534 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521605 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr9tb\" (UniqueName: \"kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521667 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521704 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521758 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.521855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf\") pod \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\" (UID: \"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c\") " Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.522475 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.522573 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.522852 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.522880 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.530619 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb" (OuterVolumeSpecName: "kube-api-access-vr9tb") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "kube-api-access-vr9tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.543478 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.543515 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts" (OuterVolumeSpecName: "scripts") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.546220 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" (UID: "2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.623843 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.623879 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr9tb\" (UniqueName: \"kubernetes.io/projected/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-kube-api-access-vr9tb\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.623889 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:56 crc kubenswrapper[4971]: I0309 09:50:56.623902 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.002322 4971 generic.go:334] "Generic (PLEG): container finished" podID="59f43291-066e-4226-9f06-c73014d8f899" containerID="89160f568ace9e2c4d450b9b914c96263c4ccfd36673522d89d8d8619a41a24d" exitCode=0 Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.002420 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" 
event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerDied","Data":"89160f568ace9e2c4d450b9b914c96263c4ccfd36673522d89d8d8619a41a24d"} Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.004703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerStarted","Data":"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756"} Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.008152 4971 generic.go:334] "Generic (PLEG): container finished" podID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" containerID="2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5" exitCode=0 Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.008201 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.008199 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" event={"ID":"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c","Type":"ContainerDied","Data":"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5"} Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.009058 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-2bspt" event={"ID":"2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c","Type":"ContainerDied","Data":"27792c9e714436c9a24a428407be08c0f86dd8391514d44ce11c2eb69a062ffd"} Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.009087 4971 scope.go:117] "RemoveContainer" containerID="2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5" Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.030941 4971 scope.go:117] "RemoveContainer" containerID="2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5" Mar 09 09:50:57 crc kubenswrapper[4971]: E0309 09:50:57.031469 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5\": container with ID starting with 2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5 not found: ID does not exist" containerID="2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5" Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.031513 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5"} err="failed to get container status \"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5\": rpc error: code = NotFound desc = could not find container \"2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5\": container with ID starting with 2e0ff353acec97343d4bb2f8655975997bbb0221a3369a9fe6d3d078ccf4f2f5 not found: ID does not exist" Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.063844 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"] Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.073579 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-2bspt"] Mar 09 09:50:57 crc kubenswrapper[4971]: I0309 09:50:57.161889 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" path="/var/lib/kubelet/pods/2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c/volumes" Mar 09 09:50:58 crc kubenswrapper[4971]: I0309 09:50:58.020337 4971 generic.go:334] "Generic (PLEG): container finished" podID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerID="370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756" exitCode=0 Mar 09 09:50:58 crc kubenswrapper[4971]: I0309 09:50:58.020562 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" 
event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerDied","Data":"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756"} Mar 09 09:50:58 crc kubenswrapper[4971]: I0309 09:50:58.028847 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerStarted","Data":"d71bf64f5bdd0ca8aece2e8acd9f32c9121446833e140a5449b678fbf2c1327a"} Mar 09 09:50:58 crc kubenswrapper[4971]: I0309 09:50:58.067851 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4hqzd" podStartSLOduration=2.450343121 podStartE2EDuration="5.067829144s" podCreationTimestamp="2026-03-09 09:50:53 +0000 UTC" firstStartedPulling="2026-03-09 09:50:54.979558597 +0000 UTC m=+1858.539486407" lastFinishedPulling="2026-03-09 09:50:57.59704462 +0000 UTC m=+1861.156972430" observedRunningTime="2026-03-09 09:50:58.060846472 +0000 UTC m=+1861.620774292" watchObservedRunningTime="2026-03-09 09:50:58.067829144 +0000 UTC m=+1861.627756954" Mar 09 09:50:59 crc kubenswrapper[4971]: I0309 09:50:59.039855 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerStarted","Data":"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c"} Mar 09 09:51:00 crc kubenswrapper[4971]: I0309 09:51:00.151615 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:51:00 crc kubenswrapper[4971]: E0309 09:51:00.153057 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:51:03 crc kubenswrapper[4971]: I0309 09:51:03.866341 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:03 crc kubenswrapper[4971]: I0309 09:51:03.866843 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:03 crc kubenswrapper[4971]: I0309 09:51:03.914621 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:03 crc kubenswrapper[4971]: I0309 09:51:03.939205 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wm52d" podStartSLOduration=7.430575542 podStartE2EDuration="9.939184508s" podCreationTimestamp="2026-03-09 09:50:54 +0000 UTC" firstStartedPulling="2026-03-09 09:50:55.993370955 +0000 UTC m=+1859.553298765" lastFinishedPulling="2026-03-09 09:50:58.501979921 +0000 UTC m=+1862.061907731" observedRunningTime="2026-03-09 09:50:59.07047449 +0000 UTC m=+1862.630402320" watchObservedRunningTime="2026-03-09 09:51:03.939184508 +0000 UTC m=+1867.499112338" Mar 09 09:51:04 crc kubenswrapper[4971]: I0309 09:51:04.119242 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:05 crc kubenswrapper[4971]: I0309 09:51:05.112530 4971 scope.go:117] "RemoveContainer" containerID="31610ab8cdf335343919d37271e3ad98627b76e135511459bf3992633eea2899" Mar 09 09:51:05 crc kubenswrapper[4971]: I0309 09:51:05.166790 4971 scope.go:117] "RemoveContainer" containerID="645a97b2a1faf42b1dacd4e05546c5e4002c02315607b9596cc2c5be2517c0e7" Mar 09 09:51:05 crc kubenswrapper[4971]: I0309 09:51:05.239729 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:05 crc kubenswrapper[4971]: I0309 09:51:05.239793 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:05 crc kubenswrapper[4971]: I0309 09:51:05.282593 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:06 crc kubenswrapper[4971]: I0309 09:51:06.132627 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:06 crc kubenswrapper[4971]: I0309 09:51:06.506541 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"] Mar 09 09:51:06 crc kubenswrapper[4971]: I0309 09:51:06.507040 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4hqzd" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="registry-server" containerID="cri-o://d71bf64f5bdd0ca8aece2e8acd9f32c9121446833e140a5449b678fbf2c1327a" gracePeriod=2 Mar 09 09:51:07 crc kubenswrapper[4971]: I0309 09:51:07.907926 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.111847 4971 generic.go:334] "Generic (PLEG): container finished" podID="59f43291-066e-4226-9f06-c73014d8f899" containerID="d71bf64f5bdd0ca8aece2e8acd9f32c9121446833e140a5449b678fbf2c1327a" exitCode=0 Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.111918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerDied","Data":"d71bf64f5bdd0ca8aece2e8acd9f32c9121446833e140a5449b678fbf2c1327a"} Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.116646 4971 generic.go:334] "Generic (PLEG): 
container finished" podID="17af7be5-5a55-4a79-b05f-098df11f2550" containerID="2d9637d355dbf9e3eb1b55f43bcdece9da435ade7e950c9dcf4dfe0c05c04d65" exitCode=1 Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.116745 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" event={"ID":"17af7be5-5a55-4a79-b05f-098df11f2550","Type":"ContainerDied","Data":"2d9637d355dbf9e3eb1b55f43bcdece9da435ade7e950c9dcf4dfe0c05c04d65"} Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.275377 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.408293 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities\") pod \"59f43291-066e-4226-9f06-c73014d8f899\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.408389 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content\") pod \"59f43291-066e-4226-9f06-c73014d8f899\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.409237 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9mmd\" (UniqueName: \"kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd\") pod \"59f43291-066e-4226-9f06-c73014d8f899\" (UID: \"59f43291-066e-4226-9f06-c73014d8f899\") " Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.409417 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities" (OuterVolumeSpecName: "utilities") pod 
"59f43291-066e-4226-9f06-c73014d8f899" (UID: "59f43291-066e-4226-9f06-c73014d8f899"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.409898 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.417881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd" (OuterVolumeSpecName: "kube-api-access-b9mmd") pod "59f43291-066e-4226-9f06-c73014d8f899" (UID: "59f43291-066e-4226-9f06-c73014d8f899"). InnerVolumeSpecName "kube-api-access-b9mmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.436615 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59f43291-066e-4226-9f06-c73014d8f899" (UID: "59f43291-066e-4226-9f06-c73014d8f899"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.511685 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9mmd\" (UniqueName: \"kubernetes.io/projected/59f43291-066e-4226-9f06-c73014d8f899-kube-api-access-b9mmd\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:08 crc kubenswrapper[4971]: I0309 09:51:08.511713 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f43291-066e-4226-9f06-c73014d8f899-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.125259 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hqzd" event={"ID":"59f43291-066e-4226-9f06-c73014d8f899","Type":"ContainerDied","Data":"c95d79f5ebe33e6e57da8c0eab45943db60698c435715de0354a24d2cb39f806"} Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.125325 4971 scope.go:117] "RemoveContainer" containerID="d71bf64f5bdd0ca8aece2e8acd9f32c9121446833e140a5449b678fbf2c1327a" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.125453 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hqzd" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.125638 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wm52d" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="registry-server" containerID="cri-o://efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c" gracePeriod=2 Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.172661 4971 scope.go:117] "RemoveContainer" containerID="89160f568ace9e2c4d450b9b914c96263c4ccfd36673522d89d8d8619a41a24d" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.178323 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"] Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.178650 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hqzd"] Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.193585 4971 scope.go:117] "RemoveContainer" containerID="69a5391bf4d4adea91b50225f87dec46d20bf0786abbe504a31e8ab7faf97ed1" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.450294 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.500880 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"] Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.525769 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.526627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.526778 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.526899 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2xww\" (UniqueName: \"kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.526951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") 
" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.527062 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices\") pod \"17af7be5-5a55-4a79-b05f-098df11f2550\" (UID: \"17af7be5-5a55-4a79-b05f-098df11f2550\") " Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.527526 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.527645 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/17af7be5-5a55-4a79-b05f-098df11f2550-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.528139 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.531089 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww" (OuterVolumeSpecName: "kube-api-access-s2xww") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "kube-api-access-s2xww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.548577 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.548983 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts" (OuterVolumeSpecName: "scripts") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.551977 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "17af7be5-5a55-4a79-b05f-098df11f2550" (UID: "17af7be5-5a55-4a79-b05f-098df11f2550"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.628638 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2xww\" (UniqueName: \"kubernetes.io/projected/17af7be5-5a55-4a79-b05f-098df11f2550-kube-api-access-s2xww\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.628678 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.628690 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.628700 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17af7be5-5a55-4a79-b05f-098df11f2550-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:09 crc kubenswrapper[4971]: I0309 09:51:09.628708 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/17af7be5-5a55-4a79-b05f-098df11f2550-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.012507 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.133908 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" event={"ID":"17af7be5-5a55-4a79-b05f-098df11f2550","Type":"ContainerDied","Data":"f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e"} Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.133958 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fa584c9c3dfc79e7ba2734f99073c5534fc78ff307774167e694821e25716e" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.134042 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.141402 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content\") pod \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.141540 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr5k4\" (UniqueName: \"kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4\") pod \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.141756 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities\") pod \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\" (UID: \"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9\") " Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.142670 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities" (OuterVolumeSpecName: "utilities") pod "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" (UID: "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.148186 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4" (OuterVolumeSpecName: "kube-api-access-lr5k4") pod "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" (UID: "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9"). InnerVolumeSpecName "kube-api-access-lr5k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.148946 4971 generic.go:334] "Generic (PLEG): container finished" podID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerID="efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c" exitCode=0 Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.149071 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wm52d" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.149208 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerDied","Data":"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c"} Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.149285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wm52d" event={"ID":"8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9","Type":"ContainerDied","Data":"c2b711da18b32c6b2a0789f3a5ec2c995c6fc771fe49883e4006375a377a577c"} Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.149310 4971 scope.go:117] "RemoveContainer" containerID="efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.170923 4971 scope.go:117] "RemoveContainer" containerID="370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.176065 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fs2pm"] Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.200427 4971 scope.go:117] "RemoveContainer" containerID="87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.225870 4971 scope.go:117] "RemoveContainer" containerID="efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.226805 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c\": container with ID starting with efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c not found: ID does not exist" 
containerID="efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.226855 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c"} err="failed to get container status \"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c\": rpc error: code = NotFound desc = could not find container \"efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c\": container with ID starting with efc0e4b4cb1beb07beb0bd2fd405365f48469552d756e77f6ed72573f00dc80c not found: ID does not exist" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.226888 4971 scope.go:117] "RemoveContainer" containerID="370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.227592 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756\": container with ID starting with 370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756 not found: ID does not exist" containerID="370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.227648 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756"} err="failed to get container status \"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756\": rpc error: code = NotFound desc = could not find container \"370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756\": container with ID starting with 370ac44f2b28df09f0de1cd4553fa8a40fa23e0fd0bfe5a3314f5aeae9207756 not found: ID does not exist" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.227680 4971 scope.go:117] 
"RemoveContainer" containerID="87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.228207 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d\": container with ID starting with 87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d not found: ID does not exist" containerID="87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.228251 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d"} err="failed to get container status \"87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d\": rpc error: code = NotFound desc = could not find container \"87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d\": container with ID starting with 87ef4b54add85b1c11a2bd760d52b16682ed01d614a7c3e9ce13cc9efc44805d not found: ID does not exist" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.243729 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr5k4\" (UniqueName: \"kubernetes.io/projected/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-kube-api-access-lr5k4\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.243767 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.290278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" 
(UID: "8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.345853 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.480505 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.487565 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wm52d"] Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620298 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8"] Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620703 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="extract-content" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620726 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="extract-content" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620743 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17af7be5-5a55-4a79-b05f-098df11f2550" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620750 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="17af7be5-5a55-4a79-b05f-098df11f2550" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620757 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="extract-utilities" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620765 
4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="extract-utilities" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620778 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620784 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620795 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="extract-content" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620801 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="extract-content" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620821 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="extract-utilities" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620828 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="extract-utilities" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620843 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620848 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: E0309 09:51:10.620862 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.620870 
4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.621081 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.621098 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="17af7be5-5a55-4a79-b05f-098df11f2550" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.621114 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f43291-066e-4226-9f06-c73014d8f899" containerName="registry-server" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.621125 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e23c3ee-7c4a-4ae5-85f6-5935fd8c856c" containerName="swift-ring-rebalance" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.621680 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.624763 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.628432 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.633969 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8"] Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751389 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751434 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751455 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751746 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvgx\" (UniqueName: \"kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.751798 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853226 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvgx\" (UniqueName: \"kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853293 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 
09:51:10.853401 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853426 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853453 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853488 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.853787 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.854405 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.854426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.857901 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.858655 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.885970 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvgx\" (UniqueName: \"kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx\") pod \"swift-ring-rebalance-debug-8zwj8\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:10 crc kubenswrapper[4971]: I0309 09:51:10.941406 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:11 crc kubenswrapper[4971]: I0309 09:51:11.152902 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:51:11 crc kubenswrapper[4971]: E0309 09:51:11.153428 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:51:11 crc kubenswrapper[4971]: I0309 09:51:11.163481 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17af7be5-5a55-4a79-b05f-098df11f2550" path="/var/lib/kubelet/pods/17af7be5-5a55-4a79-b05f-098df11f2550/volumes" Mar 09 09:51:11 crc kubenswrapper[4971]: I0309 09:51:11.163966 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f43291-066e-4226-9f06-c73014d8f899" path="/var/lib/kubelet/pods/59f43291-066e-4226-9f06-c73014d8f899/volumes" Mar 09 09:51:11 crc kubenswrapper[4971]: I0309 09:51:11.164725 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9" path="/var/lib/kubelet/pods/8eebc3a1-432b-4f2c-a94c-c7abf5b3bda9/volumes" Mar 09 09:51:11 crc kubenswrapper[4971]: I0309 09:51:11.346326 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8"] Mar 09 09:51:12 crc kubenswrapper[4971]: I0309 09:51:12.182887 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" 
event={"ID":"df13127f-438e-4920-a00f-670097dbe370","Type":"ContainerStarted","Data":"811d8a88eb980fa2f3bd981bce5a50e23c55cd765c9080394d71afcf54e7a9cf"} Mar 09 09:51:12 crc kubenswrapper[4971]: I0309 09:51:12.184509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" event={"ID":"df13127f-438e-4920-a00f-670097dbe370","Type":"ContainerStarted","Data":"ccc88601127433dd1542864527d211decb11a663e22f509ca71b8adbe709ac21"} Mar 09 09:51:12 crc kubenswrapper[4971]: I0309 09:51:12.202522 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" podStartSLOduration=2.202499056 podStartE2EDuration="2.202499056s" podCreationTimestamp="2026-03-09 09:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:12.202227978 +0000 UTC m=+1875.762155808" watchObservedRunningTime="2026-03-09 09:51:12.202499056 +0000 UTC m=+1875.762426876" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.237457 4971 generic.go:334] "Generic (PLEG): container finished" podID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerID="bd08ba1a76889c53f7f72fa52eabc950709d56d78c23c1a9da6fd3dbfe751148" exitCode=137 Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.237690 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"bd08ba1a76889c53f7f72fa52eabc950709d56d78c23c1a9da6fd3dbfe751148"} Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.370976 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.480490 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7cf1281b-f79b-4219-902e-eea6fb707cb4\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.480703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache\") pod \"7cf1281b-f79b-4219-902e-eea6fb707cb4\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.480753 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") pod \"7cf1281b-f79b-4219-902e-eea6fb707cb4\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.480789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock\") pod \"7cf1281b-f79b-4219-902e-eea6fb707cb4\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.480845 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8hm\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm\") pod \"7cf1281b-f79b-4219-902e-eea6fb707cb4\" (UID: \"7cf1281b-f79b-4219-902e-eea6fb707cb4\") " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.481799 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock" (OuterVolumeSpecName: 
"lock") pod "7cf1281b-f79b-4219-902e-eea6fb707cb4" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.482153 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache" (OuterVolumeSpecName: "cache") pod "7cf1281b-f79b-4219-902e-eea6fb707cb4" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.490895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "7cf1281b-f79b-4219-902e-eea6fb707cb4" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.490912 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7cf1281b-f79b-4219-902e-eea6fb707cb4" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.493122 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm" (OuterVolumeSpecName: "kube-api-access-gs8hm") pod "7cf1281b-f79b-4219-902e-eea6fb707cb4" (UID: "7cf1281b-f79b-4219-902e-eea6fb707cb4"). InnerVolumeSpecName "kube-api-access-gs8hm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.582112 4971 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-cache\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.582159 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.582173 4971 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7cf1281b-f79b-4219-902e-eea6fb707cb4-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.582184 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8hm\" (UniqueName: \"kubernetes.io/projected/7cf1281b-f79b-4219-902e-eea6fb707cb4-kube-api-access-gs8hm\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.582222 4971 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.596396 4971 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 09 09:51:18 crc kubenswrapper[4971]: I0309 09:51:18.683316 4971 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.252024 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"7cf1281b-f79b-4219-902e-eea6fb707cb4","Type":"ContainerDied","Data":"fb1da211386c13b25b034b538da4de69f4cde740ab8f03c5cb0885f93c546dfa"} Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.252090 4971 scope.go:117] "RemoveContainer" containerID="896548583b712817fd9ac2457aa52b694754b489cd6ec4d51bcf7a2f46d8c65d" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.252112 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.279279 4971 scope.go:117] "RemoveContainer" containerID="bd08ba1a76889c53f7f72fa52eabc950709d56d78c23c1a9da6fd3dbfe751148" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.283322 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.288159 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.297125 4971 scope.go:117] "RemoveContainer" containerID="977e8188d1d5d046c5f1d39c7dac78e43224aaeb87daa3d4bee54aee924a5664" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.314899 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315222 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.315242 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315260 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 
09:51:19.315267 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315281 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.315289 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315306 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.315313 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315321 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-sharder" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.315330 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-sharder" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.315339 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-expirer" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317234 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-expirer" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317264 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="rsync" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 
09:51:19.317272 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="rsync" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317292 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317300 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.316035 4971 scope.go:117] "RemoveContainer" containerID="8173809deda0ceecec61a266c95370ae3e2c5629f0fc60a80c3c4c47b3894635" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317312 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317399 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317432 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317441 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-server" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317464 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317471 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317495 4971 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317503 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317515 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317522 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-server" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317532 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-reaper" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317542 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-reaper" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317554 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="swift-recon-cron" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317561 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="swift-recon-cron" Mar 09 09:51:19 crc kubenswrapper[4971]: E0309 09:51:19.317580 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317587 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317877 4971 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317895 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-sharder" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317909 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317921 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317933 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317945 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-reaper" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317954 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317966 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-updater" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317978 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.317986 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="swift-recon-cron" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 
09:51:19.317997 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-auditor" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.318015 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="container-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.318024 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-expirer" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.318031 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="rsync" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.318043 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="account-replicator" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.318054 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" containerName="object-server" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.322484 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.341765 4971 scope.go:117] "RemoveContainer" containerID="0862fe7dba311c6c6b693d2c332d0b86b3f53acee8d036abe908aff5cb6d6e8f" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.347083 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.371298 4971 scope.go:117] "RemoveContainer" containerID="d7adf37c0f5ab81fc50a28e521f676d1489460683f4b8dfc2e36052a19f9d07e" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.398241 4971 scope.go:117] "RemoveContainer" containerID="e0fd9fa4462f1853a43e7cba4a34485ff301c975cb23eef544f0c47106ccc0de" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.412437 4971 scope.go:117] "RemoveContainer" containerID="24779d2ef2862ffd4c9ec9f48e564d2007ab125e37e3d3b4e67d1ea10b04135f" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.430034 4971 scope.go:117] "RemoveContainer" containerID="e990ced9d0c8867ebf7e37598faa5157c49269703059f5157a1ed4694f8990a2" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.450675 4971 scope.go:117] "RemoveContainer" containerID="439626a0c50fc4086e7623b9d44cec2d3c1789da2fdcfd975bd6b2c21ff67bde" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.467018 4971 scope.go:117] "RemoveContainer" containerID="6cac6b8f712e2749ddb02ae9fa9c26cde95ba42afc84e1c1ab4eb15dda4699ae" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.482444 4971 scope.go:117] "RemoveContainer" containerID="384d552cdcfa77a481f9fa7d9339755f30189066b1c0db676dd5af57ff9f3d86" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.495724 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-cache\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " 
pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.495857 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-etc-swift\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.495902 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpmb\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-kube-api-access-jnpmb\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.495964 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.496005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-lock\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.497530 4971 scope.go:117] "RemoveContainer" containerID="b6a6726c2672f3f891932c39730d7bf24ebbbaf74b224a8e384a86cc59e3866f" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.518929 4971 scope.go:117] "RemoveContainer" containerID="10310cdbcf68045954b30655355f31ab5609c827bd113ceb6f36c7acdb67568b" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.536282 4971 scope.go:117] 
"RemoveContainer" containerID="33d9df249f56981ed105c7cd6de3c253f19a8190ec6977cac54400d478cb03e7" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.557021 4971 scope.go:117] "RemoveContainer" containerID="70a4736032a91173a8081a9d98939447d2f1ecece350b377d82dde74455e3069" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.597528 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-etc-swift\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.597610 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpmb\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-kube-api-access-jnpmb\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.597679 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.597703 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-lock\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.597737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-cache\") pod \"swift-storage-0\" (UID: 
\"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.598145 4971 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.598222 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-cache\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.598484 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/698abc6e-c9eb-4568-8639-8c10c5958c3c-lock\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.612902 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-etc-swift\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.615660 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpmb\" (UniqueName: \"kubernetes.io/projected/698abc6e-c9eb-4568-8639-8c10c5958c3c-kube-api-access-jnpmb\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.618223 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"698abc6e-c9eb-4568-8639-8c10c5958c3c\") " pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:19 crc kubenswrapper[4971]: I0309 09:51:19.644051 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 09 09:51:20 crc kubenswrapper[4971]: I0309 09:51:20.072686 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 09 09:51:20 crc kubenswrapper[4971]: I0309 09:51:20.268055 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"184d745dd63859adfb5dbedb3c552dea61da2a635886c6597034936984207c53"} Mar 09 09:51:20 crc kubenswrapper[4971]: I0309 09:51:20.268116 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"25ee6f93523389e63ba65ae2403994ce375c8e989868e08a91f66d48031c9b8b"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.165533 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf1281b-f79b-4219-902e-eea6fb707cb4" path="/var/lib/kubelet/pods/7cf1281b-f79b-4219-902e-eea6fb707cb4/volumes" Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.285569 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"3525cfe3c8a7c660c13cbc50dbfe27010db8ccb1eeebb3b55513598cdceec044"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.286717 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"8f983a55608f7b6ece36cb6933e62130700ad15eff4da6bc675a880fc956fa0b"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.286822 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"3a6e3312c0bed9a7e07be30ae2c22e921097fd51e4e0021cb15c53a992876ecc"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.286993 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"f99076e2513c865dd356f394308cdbf1125829179e61e0a304ccd74732548136"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.287102 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"1f5a79caf2bf8bef64609e37f13f16d81c26165850bc7f84af51c957a508452f"} Mar 09 09:51:21 crc kubenswrapper[4971]: I0309 09:51:21.287169 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"2f016c29e2b7a0fb34cb53b2970ddb41653ac90c52806d062022dc7ae153516e"} Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.151976 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:51:22 crc kubenswrapper[4971]: E0309 09:51:22.152683 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.306312 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"5dea734901d0f6c48e866167befcb6433812ab96a505c0661ffd0df8cd294e94"} Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.306387 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"5f5dd89654071c3a185cd704008f682d4e10bb50d0d4f0c857cc0707a8e2d806"} Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.306401 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"b309787bdf21d934f071e7747a66a914c1efbc0a40d03746801b79b7e29436ac"} Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.306413 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"45fc7bb64171541ade2615a4b428ce355b523a932ca7ede6577e5fe111c23584"} Mar 09 09:51:22 crc kubenswrapper[4971]: I0309 09:51:22.306424 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"1b3b31498159db0d1c3e98b6d43d4e2cc0558b3b8bde27316840bd712b7b86c0"} Mar 09 09:51:23 crc kubenswrapper[4971]: I0309 09:51:23.330240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"97c8cf87514f461eb4b49b48b14f2eb2ea87ee8f41566d3d658d596c1dab63bb"} Mar 09 09:51:23 crc 
kubenswrapper[4971]: I0309 09:51:23.330288 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"9c51b3fae8e1e7a1f024696cc729dd5fa16ddd70b395838c611928e506e50c33"} Mar 09 09:51:23 crc kubenswrapper[4971]: I0309 09:51:23.330297 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"698abc6e-c9eb-4568-8639-8c10c5958c3c","Type":"ContainerStarted","Data":"77c59756070d308c0ee9724821573982bccdea6e3716a587d2ea7909685a677d"} Mar 09 09:51:23 crc kubenswrapper[4971]: I0309 09:51:23.365740 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=4.365720418 podStartE2EDuration="4.365720418s" podCreationTimestamp="2026-03-09 09:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:23.364114152 +0000 UTC m=+1886.924041962" watchObservedRunningTime="2026-03-09 09:51:23.365720418 +0000 UTC m=+1886.925648228" Mar 09 09:51:26 crc kubenswrapper[4971]: I0309 09:51:26.359918 4971 generic.go:334] "Generic (PLEG): container finished" podID="df13127f-438e-4920-a00f-670097dbe370" containerID="811d8a88eb980fa2f3bd981bce5a50e23c55cd765c9080394d71afcf54e7a9cf" exitCode=0 Mar 09 09:51:26 crc kubenswrapper[4971]: I0309 09:51:26.360013 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" event={"ID":"df13127f-438e-4920-a00f-670097dbe370","Type":"ContainerDied","Data":"811d8a88eb980fa2f3bd981bce5a50e23c55cd765c9080394d71afcf54e7a9cf"} Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.648045 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.680941 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8"] Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.767680 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.768046 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.768120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.768141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.768186 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frvgx\" (UniqueName: \"kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: 
\"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.768270 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf\") pod \"df13127f-438e-4920-a00f-670097dbe370\" (UID: \"df13127f-438e-4920-a00f-670097dbe370\") " Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.769180 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.770089 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.773676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx" (OuterVolumeSpecName: "kube-api-access-frvgx") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "kube-api-access-frvgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.788117 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts" (OuterVolumeSpecName: "scripts") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.790599 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.798211 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "df13127f-438e-4920-a00f-670097dbe370" (UID: "df13127f-438e-4920-a00f-670097dbe370"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.869991 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.870230 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.870296 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df13127f-438e-4920-a00f-670097dbe370-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.870406 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df13127f-438e-4920-a00f-670097dbe370-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.870473 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df13127f-438e-4920-a00f-670097dbe370-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:27 crc kubenswrapper[4971]: I0309 09:51:27.870542 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frvgx\" (UniqueName: \"kubernetes.io/projected/df13127f-438e-4920-a00f-670097dbe370-kube-api-access-frvgx\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.092836 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:28 crc kubenswrapper[4971]: E0309 09:51:28.095691 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df13127f-438e-4920-a00f-670097dbe370" 
containerName="swift-ring-rebalance" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.095725 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="df13127f-438e-4920-a00f-670097dbe370" containerName="swift-ring-rebalance" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.095945 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="df13127f-438e-4920-a00f-670097dbe370" containerName="swift-ring-rebalance" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.096817 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.103872 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277207 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gn6w\" (UniqueName: \"kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277295 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: 
\"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277343 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277403 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.277452 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378518 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf\") pod 
\"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378598 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gn6w\" (UniqueName: \"kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378648 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.378689 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379137 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift\") 
pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379319 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" event={"ID":"df13127f-438e-4920-a00f-670097dbe370","Type":"ContainerDied","Data":"ccc88601127433dd1542864527d211decb11a663e22f509ca71b8adbe709ac21"} Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379389 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc88601127433dd1542864527d211decb11a663e22f509ca71b8adbe709ac21" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379463 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379605 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.379981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.383803 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: 
\"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.387556 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.402000 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gn6w\" (UniqueName: \"kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w\") pod \"swift-ring-rebalance-debug-hj5qd\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.420916 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.452344 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.461544 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zwj8"] Mar 09 09:51:28 crc kubenswrapper[4971]: I0309 09:51:28.873467 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:28 crc kubenswrapper[4971]: W0309 09:51:28.874794 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aec992a_0f1f_4c6f_8a4f_54b10ef43466.slice/crio-9be2989335d5110f48da0a1195c709d0a024abf72b3f54fe67f95270194e1678 WatchSource:0}: Error finding container 9be2989335d5110f48da0a1195c709d0a024abf72b3f54fe67f95270194e1678: Status 404 returned error can't find the container with id 9be2989335d5110f48da0a1195c709d0a024abf72b3f54fe67f95270194e1678 Mar 09 09:51:29 crc kubenswrapper[4971]: I0309 09:51:29.162623 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df13127f-438e-4920-a00f-670097dbe370" path="/var/lib/kubelet/pods/df13127f-438e-4920-a00f-670097dbe370/volumes" Mar 09 09:51:29 crc kubenswrapper[4971]: I0309 09:51:29.388719 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" event={"ID":"2aec992a-0f1f-4c6f-8a4f-54b10ef43466","Type":"ContainerStarted","Data":"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133"} Mar 09 09:51:29 crc kubenswrapper[4971]: I0309 09:51:29.388780 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" 
event={"ID":"2aec992a-0f1f-4c6f-8a4f-54b10ef43466","Type":"ContainerStarted","Data":"9be2989335d5110f48da0a1195c709d0a024abf72b3f54fe67f95270194e1678"} Mar 09 09:51:29 crc kubenswrapper[4971]: I0309 09:51:29.388912 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" podUID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" containerName="swift-ring-rebalance" containerID="cri-o://25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133" gracePeriod=30 Mar 09 09:51:29 crc kubenswrapper[4971]: I0309 09:51:29.410778 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" podStartSLOduration=1.41075491 podStartE2EDuration="1.41075491s" podCreationTimestamp="2026-03-09 09:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:29.40730789 +0000 UTC m=+1892.967235710" watchObservedRunningTime="2026-03-09 09:51:29.41075491 +0000 UTC m=+1892.970682720" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.786077 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941437 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941554 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941602 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gn6w\" (UniqueName: \"kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.941824 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf\") pod \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\" (UID: \"2aec992a-0f1f-4c6f-8a4f-54b10ef43466\") " Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.942669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.942905 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.947044 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w" (OuterVolumeSpecName: "kube-api-access-5gn6w") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "kube-api-access-5gn6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.963277 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts" (OuterVolumeSpecName: "scripts") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.964934 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:30 crc kubenswrapper[4971]: I0309 09:51:30.965824 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2aec992a-0f1f-4c6f-8a4f-54b10ef43466" (UID: "2aec992a-0f1f-4c6f-8a4f-54b10ef43466"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.043578 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gn6w\" (UniqueName: \"kubernetes.io/projected/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-kube-api-access-5gn6w\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.043627 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.043650 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.043663 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc 
kubenswrapper[4971]: I0309 09:51:31.043674 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.043685 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aec992a-0f1f-4c6f-8a4f-54b10ef43466-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.406610 4971 generic.go:334] "Generic (PLEG): container finished" podID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" containerID="25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133" exitCode=0 Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.406679 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.406699 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" event={"ID":"2aec992a-0f1f-4c6f-8a4f-54b10ef43466","Type":"ContainerDied","Data":"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133"} Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.407110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" event={"ID":"2aec992a-0f1f-4c6f-8a4f-54b10ef43466","Type":"ContainerDied","Data":"9be2989335d5110f48da0a1195c709d0a024abf72b3f54fe67f95270194e1678"} Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.407154 4971 scope.go:117] "RemoveContainer" containerID="25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.420204 4971 status_manager.go:907] "Failed to delete status for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd" err="pods 
\"swift-ring-rebalance-debug-hj5qd\" not found" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.431629 4971 scope.go:117] "RemoveContainer" containerID="25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133" Mar 09 09:51:31 crc kubenswrapper[4971]: E0309 09:51:31.432056 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133\": container with ID starting with 25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133 not found: ID does not exist" containerID="25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.432100 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133"} err="failed to get container status \"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133\": rpc error: code = NotFound desc = could not find container \"25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133\": container with ID starting with 25d5baa9f6be70f1ee09448daa9c66aa1baa7cc0fd9c3663db520600dc910133 not found: ID does not exist" Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.432645 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:31 crc kubenswrapper[4971]: I0309 09:51:31.441790 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hj5qd"] Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.554608 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb"] Mar 09 09:51:32 crc kubenswrapper[4971]: E0309 09:51:32.554954 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" 
containerName="swift-ring-rebalance" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.554969 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" containerName="swift-ring-rebalance" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.555156 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" containerName="swift-ring-rebalance" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.555746 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.558648 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.559375 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.562449 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb"] Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564467 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvlt\" (UniqueName: \"kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564514 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564557 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564631 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.564846 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665780 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts\") pod 
\"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665843 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvlt\" (UniqueName: \"kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665902 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665933 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.665974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.666419 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.667518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.667518 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.670701 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.671739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.683504 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvlt\" (UniqueName: \"kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt\") pod \"swift-ring-rebalance-debug-sh7qb\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:32 crc kubenswrapper[4971]: I0309 09:51:32.872122 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:33 crc kubenswrapper[4971]: I0309 09:51:33.160245 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aec992a-0f1f-4c6f-8a4f-54b10ef43466" path="/var/lib/kubelet/pods/2aec992a-0f1f-4c6f-8a4f-54b10ef43466/volumes" Mar 09 09:51:33 crc kubenswrapper[4971]: I0309 09:51:33.288160 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb"] Mar 09 09:51:33 crc kubenswrapper[4971]: I0309 09:51:33.428774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" event={"ID":"92f92d4a-7eec-4afc-8fe7-e4bc06e86471","Type":"ContainerStarted","Data":"f0c6d705508d9202751055dd5dba7d2e59a852553fbec2358f800b4d027482f0"} Mar 09 09:51:34 crc kubenswrapper[4971]: I0309 09:51:34.437711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" event={"ID":"92f92d4a-7eec-4afc-8fe7-e4bc06e86471","Type":"ContainerStarted","Data":"b94a8e52446da69300d13c56da32b0775c2404c9cc50014310c52519c48e45b6"} Mar 09 09:51:34 crc kubenswrapper[4971]: I0309 09:51:34.459834 4971 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" podStartSLOduration=2.459815148 podStartE2EDuration="2.459815148s" podCreationTimestamp="2026-03-09 09:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:34.455288737 +0000 UTC m=+1898.015216547" watchObservedRunningTime="2026-03-09 09:51:34.459815148 +0000 UTC m=+1898.019742958" Mar 09 09:51:35 crc kubenswrapper[4971]: I0309 09:51:35.457521 4971 generic.go:334] "Generic (PLEG): container finished" podID="92f92d4a-7eec-4afc-8fe7-e4bc06e86471" containerID="b94a8e52446da69300d13c56da32b0775c2404c9cc50014310c52519c48e45b6" exitCode=0 Mar 09 09:51:35 crc kubenswrapper[4971]: I0309 09:51:35.457581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" event={"ID":"92f92d4a-7eec-4afc-8fe7-e4bc06e86471","Type":"ContainerDied","Data":"b94a8e52446da69300d13c56da32b0775c2404c9cc50014310c52519c48e45b6"} Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.152218 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:51:36 crc kubenswrapper[4971]: E0309 09:51:36.152526 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.722557 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.756808 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb"] Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.762331 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb"] Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826463 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826555 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mvlt\" (UniqueName: \"kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826602 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826626 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.826674 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts\") pod \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\" (UID: \"92f92d4a-7eec-4afc-8fe7-e4bc06e86471\") " Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.827341 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.827969 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.833025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt" (OuterVolumeSpecName: "kube-api-access-4mvlt") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "kube-api-access-4mvlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.849110 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts" (OuterVolumeSpecName: "scripts") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.851673 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.854885 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "92f92d4a-7eec-4afc-8fe7-e4bc06e86471" (UID: "92f92d4a-7eec-4afc-8fe7-e4bc06e86471"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928338 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928403 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mvlt\" (UniqueName: \"kubernetes.io/projected/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-kube-api-access-4mvlt\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928417 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928428 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928437 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:36 crc kubenswrapper[4971]: I0309 09:51:36.928444 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92f92d4a-7eec-4afc-8fe7-e4bc06e86471-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.162571 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f92d4a-7eec-4afc-8fe7-e4bc06e86471" path="/var/lib/kubelet/pods/92f92d4a-7eec-4afc-8fe7-e4bc06e86471/volumes" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.473431 4971 scope.go:117] "RemoveContainer" 
containerID="b94a8e52446da69300d13c56da32b0775c2404c9cc50014310c52519c48e45b6" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.473480 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sh7qb" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.903557 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"] Mar 09 09:51:37 crc kubenswrapper[4971]: E0309 09:51:37.904469 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f92d4a-7eec-4afc-8fe7-e4bc06e86471" containerName="swift-ring-rebalance" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.904556 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f92d4a-7eec-4afc-8fe7-e4bc06e86471" containerName="swift-ring-rebalance" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.904786 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f92d4a-7eec-4afc-8fe7-e4bc06e86471" containerName="swift-ring-rebalance" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.905522 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.907860 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.908129 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:51:37 crc kubenswrapper[4971]: I0309 09:51:37.928085 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"] Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051576 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051622 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm7q\" (UniqueName: \"kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051671 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051726 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.051742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.152806 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.152868 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm7q\" (UniqueName: \"kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" Mar 09 09:51:38 
crc kubenswrapper[4971]: I0309 09:51:38.152915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.152945 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.153021 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.153044 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.153308 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.153779 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.153944 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.157423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.157478 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.170200 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm7q\" (UniqueName: \"kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q\") pod \"swift-ring-rebalance-debug-c58mx\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.221839 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:38 crc kubenswrapper[4971]: I0309 09:51:38.482832 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"]
Mar 09 09:51:38 crc kubenswrapper[4971]: W0309 09:51:38.494746 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc562d1_85fa_4011_b00d_70eeea70bb3d.slice/crio-f310e2d439ca00029fbb10edd47055ff7ab6bf4e2bd7dbed0c3f2197923878a6 WatchSource:0}: Error finding container f310e2d439ca00029fbb10edd47055ff7ab6bf4e2bd7dbed0c3f2197923878a6: Status 404 returned error can't find the container with id f310e2d439ca00029fbb10edd47055ff7ab6bf4e2bd7dbed0c3f2197923878a6
Mar 09 09:51:39 crc kubenswrapper[4971]: I0309 09:51:39.492384 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" event={"ID":"7bc562d1-85fa-4011-b00d-70eeea70bb3d","Type":"ContainerStarted","Data":"5c1e72fd942670f367d3ce5bd9f8f69f949fbcecb953d8a09b98b497b077550c"}
Mar 09 09:51:39 crc kubenswrapper[4971]: I0309 09:51:39.492789 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" event={"ID":"7bc562d1-85fa-4011-b00d-70eeea70bb3d","Type":"ContainerStarted","Data":"f310e2d439ca00029fbb10edd47055ff7ab6bf4e2bd7dbed0c3f2197923878a6"}
Mar 09 09:51:39 crc kubenswrapper[4971]: I0309 09:51:39.514988 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" podStartSLOduration=2.51496616 podStartE2EDuration="2.51496616s" podCreationTimestamp="2026-03-09 09:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:39.511668845 +0000 UTC m=+1903.071596655" watchObservedRunningTime="2026-03-09 09:51:39.51496616 +0000 UTC m=+1903.074893970"
Mar 09 09:51:40 crc kubenswrapper[4971]: I0309 09:51:40.500142 4971 generic.go:334] "Generic (PLEG): container finished" podID="7bc562d1-85fa-4011-b00d-70eeea70bb3d" containerID="5c1e72fd942670f367d3ce5bd9f8f69f949fbcecb953d8a09b98b497b077550c" exitCode=0
Mar 09 09:51:40 crc kubenswrapper[4971]: I0309 09:51:40.500245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx" event={"ID":"7bc562d1-85fa-4011-b00d-70eeea70bb3d","Type":"ContainerDied","Data":"5c1e72fd942670f367d3ce5bd9f8f69f949fbcecb953d8a09b98b497b077550c"}
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.761682 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.799873 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"]
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.802934 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"]
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915764 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915827 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915858 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915898 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.915984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm7q\" (UniqueName: \"kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q\") pod \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\" (UID: \"7bc562d1-85fa-4011-b00d-70eeea70bb3d\") "
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.917169 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.917407 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.921486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q" (OuterVolumeSpecName: "kube-api-access-7mm7q") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "kube-api-access-7mm7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.937033 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.938942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:51:41 crc kubenswrapper[4971]: I0309 09:51:41.942000 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts" (OuterVolumeSpecName: "scripts") pod "7bc562d1-85fa-4011-b00d-70eeea70bb3d" (UID: "7bc562d1-85fa-4011-b00d-70eeea70bb3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018199 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018236 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc562d1-85fa-4011-b00d-70eeea70bb3d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018250 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018258 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc562d1-85fa-4011-b00d-70eeea70bb3d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018267 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc562d1-85fa-4011-b00d-70eeea70bb3d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.018277 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm7q\" (UniqueName: \"kubernetes.io/projected/7bc562d1-85fa-4011-b00d-70eeea70bb3d-kube-api-access-7mm7q\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.518244 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f310e2d439ca00029fbb10edd47055ff7ab6bf4e2bd7dbed0c3f2197923878a6"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.518316 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c58mx"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.937534 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vspld"]
Mar 09 09:51:42 crc kubenswrapper[4971]: E0309 09:51:42.937965 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc562d1-85fa-4011-b00d-70eeea70bb3d" containerName="swift-ring-rebalance"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.937984 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc562d1-85fa-4011-b00d-70eeea70bb3d" containerName="swift-ring-rebalance"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.938145 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc562d1-85fa-4011-b00d-70eeea70bb3d" containerName="swift-ring-rebalance"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.938672 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.940311 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.940555 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:51:42 crc kubenswrapper[4971]: I0309 09:51:42.951328 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vspld"]
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.133892 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.133991 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.134028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.134121 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.134699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9wp8\" (UniqueName: \"kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.134961 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.161205 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc562d1-85fa-4011-b00d-70eeea70bb3d" path="/var/lib/kubelet/pods/7bc562d1-85fa-4011-b00d-70eeea70bb3d/volumes"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.236489 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.236571 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9wp8\" (UniqueName: \"kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.236600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.237624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.237774 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.237828 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.237917 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.237978 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.238496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.240458 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.246873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.261202 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9wp8\" (UniqueName: \"kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8\") pod \"swift-ring-rebalance-debug-vspld\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:43 crc kubenswrapper[4971]: I0309 09:51:43.555651 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:44 crc kubenswrapper[4971]: I0309 09:51:44.002005 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vspld"]
Mar 09 09:51:44 crc kubenswrapper[4971]: W0309 09:51:44.005023 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8502f401_cb20_49b7_a975_70a55805611d.slice/crio-95bfbe8a1ff64d9edf23c323c509fd825bf014caf1d34d9ba0786a14b3c08bf0 WatchSource:0}: Error finding container 95bfbe8a1ff64d9edf23c323c509fd825bf014caf1d34d9ba0786a14b3c08bf0: Status 404 returned error can't find the container with id 95bfbe8a1ff64d9edf23c323c509fd825bf014caf1d34d9ba0786a14b3c08bf0
Mar 09 09:51:44 crc kubenswrapper[4971]: I0309 09:51:44.531277 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld" event={"ID":"8502f401-cb20-49b7-a975-70a55805611d","Type":"ContainerStarted","Data":"8361c76e64613bfbde3d9b1bfda31f07e6371073a7f89a0edf787fefcf5ce3d5"}
Mar 09 09:51:44 crc kubenswrapper[4971]: I0309 09:51:44.531575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld" event={"ID":"8502f401-cb20-49b7-a975-70a55805611d","Type":"ContainerStarted","Data":"95bfbe8a1ff64d9edf23c323c509fd825bf014caf1d34d9ba0786a14b3c08bf0"}
Mar 09 09:51:44 crc kubenswrapper[4971]: I0309 09:51:44.550960 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld" podStartSLOduration=2.550944831 podStartE2EDuration="2.550944831s" podCreationTimestamp="2026-03-09 09:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:44.548714376 +0000 UTC m=+1908.108642186" watchObservedRunningTime="2026-03-09 09:51:44.550944831 +0000 UTC m=+1908.110872641"
Mar 09 09:51:45 crc kubenswrapper[4971]: I0309 09:51:45.539472 4971 generic.go:334] "Generic (PLEG): container finished" podID="8502f401-cb20-49b7-a975-70a55805611d" containerID="8361c76e64613bfbde3d9b1bfda31f07e6371073a7f89a0edf787fefcf5ce3d5" exitCode=0
Mar 09 09:51:45 crc kubenswrapper[4971]: I0309 09:51:45.539646 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld" event={"ID":"8502f401-cb20-49b7-a975-70a55805611d","Type":"ContainerDied","Data":"8361c76e64613bfbde3d9b1bfda31f07e6371073a7f89a0edf787fefcf5ce3d5"}
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.823370 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892733 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892814 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9wp8\" (UniqueName: \"kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892851 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892875 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892967 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.892996 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts\") pod \"8502f401-cb20-49b7-a975-70a55805611d\" (UID: \"8502f401-cb20-49b7-a975-70a55805611d\") "
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.893754 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.894122 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.915938 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vspld"]
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.918944 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.920844 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8" (OuterVolumeSpecName: "kube-api-access-d9wp8") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "kube-api-access-d9wp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.924020 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vspld"]
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.929226 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts" (OuterVolumeSpecName: "scripts") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.947906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8502f401-cb20-49b7-a975-70a55805611d" (UID: "8502f401-cb20-49b7-a975-70a55805611d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994712 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994755 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994774 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8502f401-cb20-49b7-a975-70a55805611d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994786 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8502f401-cb20-49b7-a975-70a55805611d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994799 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8502f401-cb20-49b7-a975-70a55805611d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:46 crc kubenswrapper[4971]: I0309 09:51:46.994812 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9wp8\" (UniqueName: \"kubernetes.io/projected/8502f401-cb20-49b7-a975-70a55805611d-kube-api-access-d9wp8\") on node \"crc\" DevicePath \"\""
Mar 09 09:51:47 crc kubenswrapper[4971]: I0309 09:51:47.168671 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8502f401-cb20-49b7-a975-70a55805611d" path="/var/lib/kubelet/pods/8502f401-cb20-49b7-a975-70a55805611d/volumes"
Mar 09 09:51:47 crc kubenswrapper[4971]: I0309 09:51:47.556820 4971 scope.go:117] "RemoveContainer" containerID="8361c76e64613bfbde3d9b1bfda31f07e6371073a7f89a0edf787fefcf5ce3d5"
Mar 09 09:51:47 crc kubenswrapper[4971]: I0309 09:51:47.557103 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vspld"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.044038 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"]
Mar 09 09:51:48 crc kubenswrapper[4971]: E0309 09:51:48.044439 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8502f401-cb20-49b7-a975-70a55805611d" containerName="swift-ring-rebalance"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.044453 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8502f401-cb20-49b7-a975-70a55805611d" containerName="swift-ring-rebalance"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.044590 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8502f401-cb20-49b7-a975-70a55805611d" containerName="swift-ring-rebalance"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.045033 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.046611 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.046905 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.053769 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"]
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109359 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109448 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109478 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.109502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hmx\" (UniqueName: \"kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.160427 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:51:48 crc kubenswrapper[4971]: E0309 09:51:48.160864 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210700 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210766 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210812 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210848 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hmx\" (UniqueName: \"kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210918 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"
Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.210948 4971 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.211371 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.211978 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.212278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.217533 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.231443 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.256698 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hmx\" (UniqueName: \"kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx\") pod \"swift-ring-rebalance-debug-tvncr\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.399216 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:48 crc kubenswrapper[4971]: I0309 09:51:48.808240 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"] Mar 09 09:51:48 crc kubenswrapper[4971]: W0309 09:51:48.812977 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dff2cd8_cedf_4641_bd37_1f1c0b41f000.slice/crio-5f3ce81750075d24bbbcd23350a4ca99a2562acb30fe7ae674ed59fcf66e1d30 WatchSource:0}: Error finding container 5f3ce81750075d24bbbcd23350a4ca99a2562acb30fe7ae674ed59fcf66e1d30: Status 404 returned error can't find the container with id 5f3ce81750075d24bbbcd23350a4ca99a2562acb30fe7ae674ed59fcf66e1d30 Mar 09 09:51:49 crc kubenswrapper[4971]: I0309 09:51:49.584270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" event={"ID":"8dff2cd8-cedf-4641-bd37-1f1c0b41f000","Type":"ContainerStarted","Data":"91d160aedd3f8d47dff2491523b23067338b142b6c2ebb92d658924ee0e820c2"} Mar 09 09:51:49 crc kubenswrapper[4971]: I0309 09:51:49.584368 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" event={"ID":"8dff2cd8-cedf-4641-bd37-1f1c0b41f000","Type":"ContainerStarted","Data":"5f3ce81750075d24bbbcd23350a4ca99a2562acb30fe7ae674ed59fcf66e1d30"} Mar 09 09:51:49 crc kubenswrapper[4971]: I0309 09:51:49.605323 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" podStartSLOduration=1.605307649 podStartE2EDuration="1.605307649s" podCreationTimestamp="2026-03-09 09:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:49.600415678 +0000 UTC m=+1913.160343488" watchObservedRunningTime="2026-03-09 09:51:49.605307649 +0000 UTC m=+1913.165235459" Mar 09 09:51:50 crc kubenswrapper[4971]: I0309 09:51:50.595475 4971 generic.go:334] "Generic (PLEG): container finished" podID="8dff2cd8-cedf-4641-bd37-1f1c0b41f000" containerID="91d160aedd3f8d47dff2491523b23067338b142b6c2ebb92d658924ee0e820c2" exitCode=0 Mar 09 09:51:50 crc kubenswrapper[4971]: I0309 09:51:50.595557 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" event={"ID":"8dff2cd8-cedf-4641-bd37-1f1c0b41f000","Type":"ContainerDied","Data":"91d160aedd3f8d47dff2491523b23067338b142b6c2ebb92d658924ee0e820c2"} Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.852372 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860157 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860223 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860254 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860331 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860517 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6hmx\" 
(UniqueName: \"kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx\") pod \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\" (UID: \"8dff2cd8-cedf-4641-bd37-1f1c0b41f000\") " Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.860790 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.861035 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.861036 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.866896 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx" (OuterVolumeSpecName: "kube-api-access-k6hmx") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "kube-api-access-k6hmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.901011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts" (OuterVolumeSpecName: "scripts") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.904530 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.905854 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"] Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.908613 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8dff2cd8-cedf-4641-bd37-1f1c0b41f000" (UID: "8dff2cd8-cedf-4641-bd37-1f1c0b41f000"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.912300 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tvncr"] Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.962585 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.962625 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.962639 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6hmx\" (UniqueName: \"kubernetes.io/projected/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-kube-api-access-k6hmx\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.962648 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:51 crc kubenswrapper[4971]: I0309 09:51:51.962656 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8dff2cd8-cedf-4641-bd37-1f1c0b41f000-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:52 crc kubenswrapper[4971]: I0309 09:51:52.610186 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3ce81750075d24bbbcd23350a4ca99a2562acb30fe7ae674ed59fcf66e1d30" Mar 09 09:51:52 crc kubenswrapper[4971]: I0309 09:51:52.610261 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tvncr" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.008322 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5"] Mar 09 09:51:53 crc kubenswrapper[4971]: E0309 09:51:53.008679 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dff2cd8-cedf-4641-bd37-1f1c0b41f000" containerName="swift-ring-rebalance" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.008695 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dff2cd8-cedf-4641-bd37-1f1c0b41f000" containerName="swift-ring-rebalance" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.008839 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dff2cd8-cedf-4641-bd37-1f1c0b41f000" containerName="swift-ring-rebalance" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.009308 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.012534 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.012598 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.019049 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5"] Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077223 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077288 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077381 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077465 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077533 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.077559 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q66q8\" (UniqueName: 
\"kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.161401 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dff2cd8-cedf-4641-bd37-1f1c0b41f000" path="/var/lib/kubelet/pods/8dff2cd8-cedf-4641-bd37-1f1c0b41f000/volumes" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.178960 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179125 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q66q8\" (UniqueName: \"kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179196 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift\") pod \"swift-ring-rebalance-debug-gmcb5\" 
(UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179243 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.179884 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.180000 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.180439 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: 
\"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.185236 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.187712 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.200873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q66q8\" (UniqueName: \"kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8\") pod \"swift-ring-rebalance-debug-gmcb5\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.325843 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:53 crc kubenswrapper[4971]: I0309 09:51:53.756843 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5"] Mar 09 09:51:54 crc kubenswrapper[4971]: I0309 09:51:54.626250 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" event={"ID":"ea3e75e0-5062-42fa-91d8-f1276a924a65","Type":"ContainerStarted","Data":"bbbe5ff453043f93e5bf841f40c66d7901b5e1cfe9ec79504e0bf8f782fb4605"} Mar 09 09:51:54 crc kubenswrapper[4971]: I0309 09:51:54.626292 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" event={"ID":"ea3e75e0-5062-42fa-91d8-f1276a924a65","Type":"ContainerStarted","Data":"d50dc7d6a26fa8a3b20655eaa5bc9ef68501e7a7240f9a7962ccd05e90fb56f5"} Mar 09 09:51:54 crc kubenswrapper[4971]: I0309 09:51:54.649203 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" podStartSLOduration=2.649182038 podStartE2EDuration="2.649182038s" podCreationTimestamp="2026-03-09 09:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:54.642151095 +0000 UTC m=+1918.202078915" watchObservedRunningTime="2026-03-09 09:51:54.649182038 +0000 UTC m=+1918.209109858" Mar 09 09:51:55 crc kubenswrapper[4971]: I0309 09:51:55.633648 4971 generic.go:334] "Generic (PLEG): container finished" podID="ea3e75e0-5062-42fa-91d8-f1276a924a65" containerID="bbbe5ff453043f93e5bf841f40c66d7901b5e1cfe9ec79504e0bf8f782fb4605" exitCode=0 Mar 09 09:51:55 crc kubenswrapper[4971]: I0309 09:51:55.633703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" 
event={"ID":"ea3e75e0-5062-42fa-91d8-f1276a924a65","Type":"ContainerDied","Data":"bbbe5ff453043f93e5bf841f40c66d7901b5e1cfe9ec79504e0bf8f782fb4605"} Mar 09 09:51:56 crc kubenswrapper[4971]: I0309 09:51:56.881851 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:56 crc kubenswrapper[4971]: I0309 09:51:56.909695 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5"] Mar 09 09:51:56 crc kubenswrapper[4971]: I0309 09:51:56.915721 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5"] Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.040923 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q66q8\" (UniqueName: \"kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041004 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041109 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041189 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041251 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041268 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf\") pod \"ea3e75e0-5062-42fa-91d8-f1276a924a65\" (UID: \"ea3e75e0-5062-42fa-91d8-f1276a924a65\") " Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.041820 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.042043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.046277 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8" (OuterVolumeSpecName: "kube-api-access-q66q8") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "kube-api-access-q66q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.062092 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts" (OuterVolumeSpecName: "scripts") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.063741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.064337 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ea3e75e0-5062-42fa-91d8-f1276a924a65" (UID: "ea3e75e0-5062-42fa-91d8-f1276a924a65"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.142693 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q66q8\" (UniqueName: \"kubernetes.io/projected/ea3e75e0-5062-42fa-91d8-f1276a924a65-kube-api-access-q66q8\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.142951 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.143059 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.143139 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea3e75e0-5062-42fa-91d8-f1276a924a65-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.143218 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea3e75e0-5062-42fa-91d8-f1276a924a65-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.143312 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea3e75e0-5062-42fa-91d8-f1276a924a65-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.161097 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3e75e0-5062-42fa-91d8-f1276a924a65" path="/var/lib/kubelet/pods/ea3e75e0-5062-42fa-91d8-f1276a924a65/volumes" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.650673 4971 scope.go:117] "RemoveContainer" 
containerID="bbbe5ff453043f93e5bf841f40c66d7901b5e1cfe9ec79504e0bf8f782fb4605" Mar 09 09:51:57 crc kubenswrapper[4971]: I0309 09:51:57.650727 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gmcb5" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.078293 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc"] Mar 09 09:51:58 crc kubenswrapper[4971]: E0309 09:51:58.095136 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3e75e0-5062-42fa-91d8-f1276a924a65" containerName="swift-ring-rebalance" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.095188 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3e75e0-5062-42fa-91d8-f1276a924a65" containerName="swift-ring-rebalance" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.095711 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3e75e0-5062-42fa-91d8-f1276a924a65" containerName="swift-ring-rebalance" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.097292 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.104888 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.105166 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.117631 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc"] Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155513 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45gc\" (UniqueName: \"kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155564 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155662 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155702 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.155815 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.256612 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.256672 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc 
kubenswrapper[4971]: I0309 09:51:58.256736 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k45gc\" (UniqueName: \"kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.256760 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.256823 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.256855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.257849 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc 
kubenswrapper[4971]: I0309 09:51:58.259001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.259364 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.262385 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.266814 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.273477 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45gc\" (UniqueName: \"kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc\") pod \"swift-ring-rebalance-debug-xn5hc\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: 
I0309 09:51:58.441664 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:51:58 crc kubenswrapper[4971]: I0309 09:51:58.854968 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc"] Mar 09 09:51:59 crc kubenswrapper[4971]: I0309 09:51:59.669534 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" event={"ID":"fa7ab013-9129-46fc-8b55-97f2118e767b","Type":"ContainerStarted","Data":"6012da9cb7cab1a63e85164cac194e0a7304a8809c24b43d7dc38dfc8edd57fb"} Mar 09 09:51:59 crc kubenswrapper[4971]: I0309 09:51:59.669919 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" event={"ID":"fa7ab013-9129-46fc-8b55-97f2118e767b","Type":"ContainerStarted","Data":"9737af408f197e41ec57c86e89244e0cd96d0bba658fcd8594af89577f19157b"} Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.131208 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" podStartSLOduration=2.131185397 podStartE2EDuration="2.131185397s" podCreationTimestamp="2026-03-09 09:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:51:59.689059221 +0000 UTC m=+1923.248987051" watchObservedRunningTime="2026-03-09 09:52:00.131185397 +0000 UTC m=+1923.691113217" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.136613 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550832-7sp5k"] Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.137656 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.139858 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.140132 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.142291 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.144986 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-7sp5k"] Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.286713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xjgx\" (UniqueName: \"kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx\") pod \"auto-csr-approver-29550832-7sp5k\" (UID: \"f3cdaf5b-75af-469e-b233-f1cadc6f49a0\") " pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.388905 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xjgx\" (UniqueName: \"kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx\") pod \"auto-csr-approver-29550832-7sp5k\" (UID: \"f3cdaf5b-75af-469e-b233-f1cadc6f49a0\") " pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.408293 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xjgx\" (UniqueName: \"kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx\") pod \"auto-csr-approver-29550832-7sp5k\" (UID: \"f3cdaf5b-75af-469e-b233-f1cadc6f49a0\") " 
pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.453209 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.678584 4971 generic.go:334] "Generic (PLEG): container finished" podID="fa7ab013-9129-46fc-8b55-97f2118e767b" containerID="6012da9cb7cab1a63e85164cac194e0a7304a8809c24b43d7dc38dfc8edd57fb" exitCode=0 Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.678640 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" event={"ID":"fa7ab013-9129-46fc-8b55-97f2118e767b","Type":"ContainerDied","Data":"6012da9cb7cab1a63e85164cac194e0a7304a8809c24b43d7dc38dfc8edd57fb"} Mar 09 09:52:00 crc kubenswrapper[4971]: I0309 09:52:00.857553 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-7sp5k"] Mar 09 09:52:00 crc kubenswrapper[4971]: W0309 09:52:00.864940 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3cdaf5b_75af_469e_b233_f1cadc6f49a0.slice/crio-c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3 WatchSource:0}: Error finding container c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3: Status 404 returned error can't find the container with id c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3 Mar 09 09:52:01 crc kubenswrapper[4971]: I0309 09:52:01.688626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" event={"ID":"f3cdaf5b-75af-469e-b233-f1cadc6f49a0","Type":"ContainerStarted","Data":"c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3"} Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.003342 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.046732 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc"] Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.068148 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc"] Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.113464 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.113589 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.113706 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.113811 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.114198 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.114443 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.114522 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k45gc\" (UniqueName: \"kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.114583 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts\") pod \"fa7ab013-9129-46fc-8b55-97f2118e767b\" (UID: \"fa7ab013-9129-46fc-8b55-97f2118e767b\") " Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.115050 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa7ab013-9129-46fc-8b55-97f2118e767b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.115067 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc 
kubenswrapper[4971]: I0309 09:52:02.124650 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc" (OuterVolumeSpecName: "kube-api-access-k45gc") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "kube-api-access-k45gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.135838 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.152490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts" (OuterVolumeSpecName: "scripts") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.153065 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fa7ab013-9129-46fc-8b55-97f2118e767b" (UID: "fa7ab013-9129-46fc-8b55-97f2118e767b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.215705 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.215734 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa7ab013-9129-46fc-8b55-97f2118e767b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.215744 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k45gc\" (UniqueName: \"kubernetes.io/projected/fa7ab013-9129-46fc-8b55-97f2118e767b-kube-api-access-k45gc\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.215844 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa7ab013-9129-46fc-8b55-97f2118e767b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.696434 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9737af408f197e41ec57c86e89244e0cd96d0bba658fcd8594af89577f19157b" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.696812 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xn5hc" Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.698800 4971 generic.go:334] "Generic (PLEG): container finished" podID="f3cdaf5b-75af-469e-b233-f1cadc6f49a0" containerID="85009dcc1fa7756739a14ad0cb0328f3f6018b3684fccf1ffb98460612ee0a23" exitCode=0 Mar 09 09:52:02 crc kubenswrapper[4971]: I0309 09:52:02.698836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" event={"ID":"f3cdaf5b-75af-469e-b233-f1cadc6f49a0","Type":"ContainerDied","Data":"85009dcc1fa7756739a14ad0cb0328f3f6018b3684fccf1ffb98460612ee0a23"} Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.152933 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:52:03 crc kubenswrapper[4971]: E0309 09:52:03.153171 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.162925 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7ab013-9129-46fc-8b55-97f2118e767b" path="/var/lib/kubelet/pods/fa7ab013-9129-46fc-8b55-97f2118e767b/volumes" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.163382 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt"] Mar 09 09:52:03 crc kubenswrapper[4971]: E0309 09:52:03.163619 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7ab013-9129-46fc-8b55-97f2118e767b" containerName="swift-ring-rebalance" Mar 09 09:52:03 crc 
kubenswrapper[4971]: I0309 09:52:03.163633 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7ab013-9129-46fc-8b55-97f2118e767b" containerName="swift-ring-rebalance" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.163784 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7ab013-9129-46fc-8b55-97f2118e767b" containerName="swift-ring-rebalance" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.164290 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.167059 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.167893 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.177543 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt"] Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.331111 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.331204 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 
09:52:03.331225 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.331670 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.331711 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfh22\" (UniqueName: \"kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.334719 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.436451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.436916 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfh22\" (UniqueName: \"kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.436962 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.437076 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.437939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.437969 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: 
\"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.438466 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.438568 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.438992 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.442844 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.442988 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.459156 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfh22\" (UniqueName: \"kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22\") pod \"swift-ring-rebalance-debug-k9nrt\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.485403 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.692647 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt"] Mar 09 09:52:03 crc kubenswrapper[4971]: W0309 09:52:03.698419 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc9e7d2d_75bf_4d7a_9117_0b2410ff833a.slice/crio-c7e8a4c644513dac4d5070346069020dec9301344b1a20135295b7689670571f WatchSource:0}: Error finding container c7e8a4c644513dac4d5070346069020dec9301344b1a20135295b7689670571f: Status 404 returned error can't find the container with id c7e8a4c644513dac4d5070346069020dec9301344b1a20135295b7689670571f Mar 09 09:52:03 crc kubenswrapper[4971]: I0309 09:52:03.709585 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" event={"ID":"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a","Type":"ContainerStarted","Data":"c7e8a4c644513dac4d5070346069020dec9301344b1a20135295b7689670571f"} Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.004978 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.147646 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xjgx\" (UniqueName: \"kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx\") pod \"f3cdaf5b-75af-469e-b233-f1cadc6f49a0\" (UID: \"f3cdaf5b-75af-469e-b233-f1cadc6f49a0\") " Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.152224 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx" (OuterVolumeSpecName: "kube-api-access-2xjgx") pod "f3cdaf5b-75af-469e-b233-f1cadc6f49a0" (UID: "f3cdaf5b-75af-469e-b233-f1cadc6f49a0"). InnerVolumeSpecName "kube-api-access-2xjgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.250066 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xjgx\" (UniqueName: \"kubernetes.io/projected/f3cdaf5b-75af-469e-b233-f1cadc6f49a0-kube-api-access-2xjgx\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.720285 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" event={"ID":"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a","Type":"ContainerStarted","Data":"4e149b5b9c3f5bbb3f7867076809b14ba330f611e0df39a5fc3bd688dfe0d1e8"} Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.722760 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" event={"ID":"f3cdaf5b-75af-469e-b233-f1cadc6f49a0","Type":"ContainerDied","Data":"c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3"} Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.722789 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c1296bcf0467080c160a756862079022fbc0f5f71287d919bae2d15bde2678b3" Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.722918 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-7sp5k" Mar 09 09:52:04 crc kubenswrapper[4971]: I0309 09:52:04.746582 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" podStartSLOduration=1.7465637410000001 podStartE2EDuration="1.746563741s" podCreationTimestamp="2026-03-09 09:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:04.743662178 +0000 UTC m=+1928.303589988" watchObservedRunningTime="2026-03-09 09:52:04.746563741 +0000 UTC m=+1928.306491551" Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.059167 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-4xn5v"] Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.071494 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-4xn5v"] Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.160244 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6dc741c-5773-4202-9293-aec350558517" path="/var/lib/kubelet/pods/e6dc741c-5773-4202-9293-aec350558517/volumes" Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.270125 4971 scope.go:117] "RemoveContainer" containerID="81bdf10f7d99ca46218cb59d915f9c767940f5d8ba0338265356992f5b7860cf" Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.732942 4971 generic.go:334] "Generic (PLEG): container finished" podID="fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" containerID="4e149b5b9c3f5bbb3f7867076809b14ba330f611e0df39a5fc3bd688dfe0d1e8" exitCode=0 Mar 09 09:52:05 crc kubenswrapper[4971]: I0309 09:52:05.733022 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" event={"ID":"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a","Type":"ContainerDied","Data":"4e149b5b9c3f5bbb3f7867076809b14ba330f611e0df39a5fc3bd688dfe0d1e8"} Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.048821 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.082609 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt"] Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.086405 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfh22\" (UniqueName: \"kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.088726 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt"] Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.092055 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22" (OuterVolumeSpecName: "kube-api-access-tfh22") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). InnerVolumeSpecName "kube-api-access-tfh22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.187821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.187864 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.188329 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.188499 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.188377 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.188587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf\") pod \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\" (UID: \"fc9e7d2d-75bf-4d7a-9117-0b2410ff833a\") " Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.188611 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.189128 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfh22\" (UniqueName: \"kubernetes.io/projected/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-kube-api-access-tfh22\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.189160 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.189171 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.204840 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts" (OuterVolumeSpecName: "scripts") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.209590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.209675 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" (UID: "fc9e7d2d-75bf-4d7a-9117-0b2410ff833a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.289816 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.289854 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.289866 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.750602 4971 scope.go:117] "RemoveContainer" containerID="4e149b5b9c3f5bbb3f7867076809b14ba330f611e0df39a5fc3bd688dfe0d1e8" Mar 09 09:52:07 crc kubenswrapper[4971]: I0309 09:52:07.750654 4971 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k9nrt" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206001 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6"] Mar 09 09:52:08 crc kubenswrapper[4971]: E0309 09:52:08.206295 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cdaf5b-75af-469e-b233-f1cadc6f49a0" containerName="oc" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206306 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cdaf5b-75af-469e-b233-f1cadc6f49a0" containerName="oc" Mar 09 09:52:08 crc kubenswrapper[4971]: E0309 09:52:08.206325 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" containerName="swift-ring-rebalance" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206331 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" containerName="swift-ring-rebalance" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206489 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cdaf5b-75af-469e-b233-f1cadc6f49a0" containerName="oc" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206509 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" containerName="swift-ring-rebalance" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.206963 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.209520 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.210726 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.217058 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6"] Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.302962 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.303030 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.303169 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.303206 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-776vg\" (UniqueName: \"kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.303239 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.303400 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404081 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404430 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404487 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404513 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-776vg\" (UniqueName: \"kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404536 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.404556 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.405108 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.405395 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.405622 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.412916 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.414206 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.425426 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-776vg\" (UniqueName: \"kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg\") pod \"swift-ring-rebalance-debug-v8hg6\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.525780 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.736101 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6"] Mar 09 09:52:08 crc kubenswrapper[4971]: I0309 09:52:08.759584 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" event={"ID":"ede2637c-e034-401a-aed2-4bee64eaee46","Type":"ContainerStarted","Data":"4f03491ad0b8de6bbd34811ffd975076b9a75e1be83bd65d18dbf77aa3149a35"} Mar 09 09:52:09 crc kubenswrapper[4971]: I0309 09:52:09.162328 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9e7d2d-75bf-4d7a-9117-0b2410ff833a" path="/var/lib/kubelet/pods/fc9e7d2d-75bf-4d7a-9117-0b2410ff833a/volumes" Mar 09 09:52:09 crc kubenswrapper[4971]: I0309 09:52:09.768927 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" event={"ID":"ede2637c-e034-401a-aed2-4bee64eaee46","Type":"ContainerStarted","Data":"a1e7cb89cb53089e1513d4e0dd5f9375d450bdb2770aca66d03ef0c8257ff6f3"} Mar 09 09:52:09 crc kubenswrapper[4971]: I0309 09:52:09.785939 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" podStartSLOduration=1.785918589 podStartE2EDuration="1.785918589s" podCreationTimestamp="2026-03-09 09:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:09.783361506 +0000 UTC m=+1933.343289336" watchObservedRunningTime="2026-03-09 09:52:09.785918589 +0000 UTC m=+1933.345846399" Mar 09 09:52:10 crc kubenswrapper[4971]: I0309 09:52:10.778438 4971 generic.go:334] "Generic (PLEG): container finished" podID="ede2637c-e034-401a-aed2-4bee64eaee46" 
containerID="a1e7cb89cb53089e1513d4e0dd5f9375d450bdb2770aca66d03ef0c8257ff6f3" exitCode=0 Mar 09 09:52:10 crc kubenswrapper[4971]: I0309 09:52:10.778531 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" event={"ID":"ede2637c-e034-401a-aed2-4bee64eaee46","Type":"ContainerDied","Data":"a1e7cb89cb53089e1513d4e0dd5f9375d450bdb2770aca66d03ef0c8257ff6f3"} Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.079106 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.117655 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6"] Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.124739 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6"] Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.256886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-776vg\" (UniqueName: \"kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.256977 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.257001 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: 
\"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.257056 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.257128 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.257187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts\") pod \"ede2637c-e034-401a-aed2-4bee64eaee46\" (UID: \"ede2637c-e034-401a-aed2-4bee64eaee46\") " Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.258126 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.258269 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.258711 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.258730 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ede2637c-e034-401a-aed2-4bee64eaee46-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.262319 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg" (OuterVolumeSpecName: "kube-api-access-776vg") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "kube-api-access-776vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.279825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts" (OuterVolumeSpecName: "scripts") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.281657 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.286505 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ede2637c-e034-401a-aed2-4bee64eaee46" (UID: "ede2637c-e034-401a-aed2-4bee64eaee46"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.359788 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.359824 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ede2637c-e034-401a-aed2-4bee64eaee46-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.359833 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ede2637c-e034-401a-aed2-4bee64eaee46-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.359843 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-776vg\" (UniqueName: \"kubernetes.io/projected/ede2637c-e034-401a-aed2-4bee64eaee46-kube-api-access-776vg\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.797567 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f03491ad0b8de6bbd34811ffd975076b9a75e1be83bd65d18dbf77aa3149a35" Mar 09 09:52:12 crc kubenswrapper[4971]: I0309 09:52:12.797800 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8hg6" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.164908 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede2637c-e034-401a-aed2-4bee64eaee46" path="/var/lib/kubelet/pods/ede2637c-e034-401a-aed2-4bee64eaee46/volumes" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.251575 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x"] Mar 09 09:52:13 crc kubenswrapper[4971]: E0309 09:52:13.251851 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede2637c-e034-401a-aed2-4bee64eaee46" containerName="swift-ring-rebalance" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.251863 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede2637c-e034-401a-aed2-4bee64eaee46" containerName="swift-ring-rebalance" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.252046 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede2637c-e034-401a-aed2-4bee64eaee46" containerName="swift-ring-rebalance" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.252509 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.255185 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.255817 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.263119 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x"] Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376203 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376301 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f7pm\" (UniqueName: \"kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376475 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376578 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.376640 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.478561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.478641 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc 
kubenswrapper[4971]: I0309 09:52:13.478678 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.478713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7pm\" (UniqueName: \"kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.478782 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.478808 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.479399 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc 
kubenswrapper[4971]: I0309 09:52:13.479785 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.479812 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.487041 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.487062 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.497622 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7pm\" (UniqueName: \"kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm\") pod \"swift-ring-rebalance-debug-6tx6x\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: 
I0309 09:52:13.573465 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:13 crc kubenswrapper[4971]: I0309 09:52:13.984362 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x"] Mar 09 09:52:14 crc kubenswrapper[4971]: I0309 09:52:14.152628 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:52:14 crc kubenswrapper[4971]: E0309 09:52:14.152893 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:52:14 crc kubenswrapper[4971]: I0309 09:52:14.815903 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" event={"ID":"af994edb-eeb8-4dc1-a592-4414a1ea1b0d","Type":"ContainerStarted","Data":"e93b3ecbf05d2a80a452f4e09c1b56b77749f18c5dbdbbeb500da4b4ce2182cf"} Mar 09 09:52:14 crc kubenswrapper[4971]: I0309 09:52:14.816257 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" event={"ID":"af994edb-eeb8-4dc1-a592-4414a1ea1b0d","Type":"ContainerStarted","Data":"ff6e0a076c2eea3b362c55641bfd471deb1c23a84f4c4543af46e0b62c184900"} Mar 09 09:52:14 crc kubenswrapper[4971]: I0309 09:52:14.842132 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" podStartSLOduration=1.842112342 podStartE2EDuration="1.842112342s" podCreationTimestamp="2026-03-09 09:52:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:14.83682407 +0000 UTC m=+1938.396751880" watchObservedRunningTime="2026-03-09 09:52:14.842112342 +0000 UTC m=+1938.402040152" Mar 09 09:52:15 crc kubenswrapper[4971]: I0309 09:52:15.824480 4971 generic.go:334] "Generic (PLEG): container finished" podID="af994edb-eeb8-4dc1-a592-4414a1ea1b0d" containerID="e93b3ecbf05d2a80a452f4e09c1b56b77749f18c5dbdbbeb500da4b4ce2182cf" exitCode=0 Mar 09 09:52:15 crc kubenswrapper[4971]: I0309 09:52:15.824589 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" event={"ID":"af994edb-eeb8-4dc1-a592-4414a1ea1b0d","Type":"ContainerDied","Data":"e93b3ecbf05d2a80a452f4e09c1b56b77749f18c5dbdbbeb500da4b4ce2182cf"} Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.182179 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.215158 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x"] Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.221337 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x"] Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346715 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f7pm\" (UniqueName: \"kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346768 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346824 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346854 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.346969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf\") pod \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\" (UID: \"af994edb-eeb8-4dc1-a592-4414a1ea1b0d\") " Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.347603 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.348372 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.356553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm" (OuterVolumeSpecName: "kube-api-access-7f7pm") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "kube-api-access-7f7pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.365239 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts" (OuterVolumeSpecName: "scripts") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.369520 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.370633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "af994edb-eeb8-4dc1-a592-4414a1ea1b0d" (UID: "af994edb-eeb8-4dc1-a592-4414a1ea1b0d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449149 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f7pm\" (UniqueName: \"kubernetes.io/projected/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-kube-api-access-7f7pm\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449203 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449216 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449228 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449269 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.449281 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/af994edb-eeb8-4dc1-a592-4414a1ea1b0d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.849596 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff6e0a076c2eea3b362c55641bfd471deb1c23a84f4c4543af46e0b62c184900" Mar 09 09:52:17 crc kubenswrapper[4971]: I0309 09:52:17.849665 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6tx6x" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.346075 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"] Mar 09 09:52:18 crc kubenswrapper[4971]: E0309 09:52:18.346685 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af994edb-eeb8-4dc1-a592-4414a1ea1b0d" containerName="swift-ring-rebalance" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.346699 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af994edb-eeb8-4dc1-a592-4414a1ea1b0d" containerName="swift-ring-rebalance" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.346863 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="af994edb-eeb8-4dc1-a592-4414a1ea1b0d" containerName="swift-ring-rebalance" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.348466 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.350569 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.351971 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.355325 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"] Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388080 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388129 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388184 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388264 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.388286 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hwj\" (UniqueName: \"kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.489763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.489831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc 
kubenswrapper[4971]: I0309 09:52:18.489855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.489899 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.489951 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.489980 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hwj\" (UniqueName: \"kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.490811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 
09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.491446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.491870 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.494268 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.496124 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc kubenswrapper[4971]: I0309 09:52:18.512745 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hwj\" (UniqueName: \"kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj\") pod \"swift-ring-rebalance-debug-j4g6b\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" Mar 09 09:52:18 crc 
kubenswrapper[4971]: I0309 09:52:18.702882 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"
Mar 09 09:52:19 crc kubenswrapper[4971]: I0309 09:52:19.137786 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"]
Mar 09 09:52:19 crc kubenswrapper[4971]: I0309 09:52:19.166326 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af994edb-eeb8-4dc1-a592-4414a1ea1b0d" path="/var/lib/kubelet/pods/af994edb-eeb8-4dc1-a592-4414a1ea1b0d/volumes"
Mar 09 09:52:19 crc kubenswrapper[4971]: I0309 09:52:19.866611 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" event={"ID":"21b3c96b-403e-47fb-b0b9-1b35d7d51a84","Type":"ContainerStarted","Data":"732971afa840b8df1a6736b8629246b240281a77aa37b9cb42e5b31288370944"}
Mar 09 09:52:19 crc kubenswrapper[4971]: I0309 09:52:19.866951 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" event={"ID":"21b3c96b-403e-47fb-b0b9-1b35d7d51a84","Type":"ContainerStarted","Data":"1f263ff06646c9110401eda13a7da477967a3739e755da6f960045bd51b6d460"}
Mar 09 09:52:19 crc kubenswrapper[4971]: I0309 09:52:19.903112 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" podStartSLOduration=1.903085093 podStartE2EDuration="1.903085093s" podCreationTimestamp="2026-03-09 09:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:19.895257537 +0000 UTC m=+1943.455185337" watchObservedRunningTime="2026-03-09 09:52:19.903085093 +0000 UTC m=+1943.463012913"
Mar 09 09:52:20 crc kubenswrapper[4971]: I0309 09:52:20.874960 4971 generic.go:334] "Generic (PLEG): container finished" podID="21b3c96b-403e-47fb-b0b9-1b35d7d51a84" containerID="732971afa840b8df1a6736b8629246b240281a77aa37b9cb42e5b31288370944" exitCode=0
Mar 09 09:52:20 crc kubenswrapper[4971]: I0309 09:52:20.875013 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b" event={"ID":"21b3c96b-403e-47fb-b0b9-1b35d7d51a84","Type":"ContainerDied","Data":"732971afa840b8df1a6736b8629246b240281a77aa37b9cb42e5b31288370944"}
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.169501 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.205743 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"]
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.211091 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"]
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243148 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243385 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243479 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243532 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6hwj\" (UniqueName: \"kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.243561 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf\") pod \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\" (UID: \"21b3c96b-403e-47fb-b0b9-1b35d7d51a84\") "
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.244147 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.245036 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.248815 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj" (OuterVolumeSpecName: "kube-api-access-z6hwj") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "kube-api-access-z6hwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.262933 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts" (OuterVolumeSpecName: "scripts") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.266799 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.268764 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21b3c96b-403e-47fb-b0b9-1b35d7d51a84" (UID: "21b3c96b-403e-47fb-b0b9-1b35d7d51a84"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345416 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6hwj\" (UniqueName: \"kubernetes.io/projected/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-kube-api-access-z6hwj\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345464 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345479 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345493 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345506 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.345518 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21b3c96b-403e-47fb-b0b9-1b35d7d51a84-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.895168 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f263ff06646c9110401eda13a7da477967a3739e755da6f960045bd51b6d460"
Mar 09 09:52:22 crc kubenswrapper[4971]: I0309 09:52:22.895239 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j4g6b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.161799 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b3c96b-403e-47fb-b0b9-1b35d7d51a84" path="/var/lib/kubelet/pods/21b3c96b-403e-47fb-b0b9-1b35d7d51a84/volumes"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.335431 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"]
Mar 09 09:52:23 crc kubenswrapper[4971]: E0309 09:52:23.335824 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b3c96b-403e-47fb-b0b9-1b35d7d51a84" containerName="swift-ring-rebalance"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.335840 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b3c96b-403e-47fb-b0b9-1b35d7d51a84" containerName="swift-ring-rebalance"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.336042 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b3c96b-403e-47fb-b0b9-1b35d7d51a84" containerName="swift-ring-rebalance"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.336646 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.340753 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.341081 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.349847 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"]
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358238 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358301 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358366 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n948h\" (UniqueName: \"kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358522 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.358544 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460095 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460142 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460178 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460217 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460259 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n948h\" (UniqueName: \"kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460311 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.460809 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.461023 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.461991 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.464662 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.468895 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.477968 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n948h\" (UniqueName: \"kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h\") pod \"swift-ring-rebalance-debug-9qt9b\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.659438 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.891044 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"]
Mar 09 09:52:23 crc kubenswrapper[4971]: I0309 09:52:23.915289 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b" event={"ID":"6a90a621-c900-4f6b-8629-601fca5b8fe6","Type":"ContainerStarted","Data":"e1e696d66931910132f07ac8078fca8217f335ddd5e181309f635a43b27b8047"}
Mar 09 09:52:24 crc kubenswrapper[4971]: I0309 09:52:24.922423 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b" event={"ID":"6a90a621-c900-4f6b-8629-601fca5b8fe6","Type":"ContainerStarted","Data":"155016833dc428af8abfc333edc45781dccb518cf18df24f90eae524b6a98026"}
Mar 09 09:52:24 crc kubenswrapper[4971]: I0309 09:52:24.938772 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b" podStartSLOduration=1.9387556030000002 podStartE2EDuration="1.938755603s" podCreationTimestamp="2026-03-09 09:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:24.935551651 +0000 UTC m=+1948.495479471" watchObservedRunningTime="2026-03-09 09:52:24.938755603 +0000 UTC m=+1948.498683413"
Mar 09 09:52:25 crc kubenswrapper[4971]: I0309 09:52:25.932500 4971 generic.go:334] "Generic (PLEG): container finished" podID="6a90a621-c900-4f6b-8629-601fca5b8fe6" containerID="155016833dc428af8abfc333edc45781dccb518cf18df24f90eae524b6a98026" exitCode=0
Mar 09 09:52:25 crc kubenswrapper[4971]: I0309 09:52:25.932579 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b" event={"ID":"6a90a621-c900-4f6b-8629-601fca5b8fe6","Type":"ContainerDied","Data":"155016833dc428af8abfc333edc45781dccb518cf18df24f90eae524b6a98026"}
Mar 09 09:52:26 crc kubenswrapper[4971]: I0309 09:52:26.151972 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:52:26 crc kubenswrapper[4971]: E0309 09:52:26.152264 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.203157 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.239198 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"]
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.245109 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"]
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.312873 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.312940 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.312974 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.313001 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.313037 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n948h\" (UniqueName: \"kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.313101 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices\") pod \"6a90a621-c900-4f6b-8629-601fca5b8fe6\" (UID: \"6a90a621-c900-4f6b-8629-601fca5b8fe6\") "
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.313716 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.315097 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.319565 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h" (OuterVolumeSpecName: "kube-api-access-n948h") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "kube-api-access-n948h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.338617 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.338650 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.343897 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts" (OuterVolumeSpecName: "scripts") pod "6a90a621-c900-4f6b-8629-601fca5b8fe6" (UID: "6a90a621-c900-4f6b-8629-601fca5b8fe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416132 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416182 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416191 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a90a621-c900-4f6b-8629-601fca5b8fe6-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416199 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a90a621-c900-4f6b-8629-601fca5b8fe6-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416209 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n948h\" (UniqueName: \"kubernetes.io/projected/6a90a621-c900-4f6b-8629-601fca5b8fe6-kube-api-access-n948h\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.416239 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a90a621-c900-4f6b-8629-601fca5b8fe6-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.953675 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1e696d66931910132f07ac8078fca8217f335ddd5e181309f635a43b27b8047"
Mar 09 09:52:27 crc kubenswrapper[4971]: I0309 09:52:27.953724 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qt9b"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.372586 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"]
Mar 09 09:52:28 crc kubenswrapper[4971]: E0309 09:52:28.373082 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a90a621-c900-4f6b-8629-601fca5b8fe6" containerName="swift-ring-rebalance"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.373112 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a90a621-c900-4f6b-8629-601fca5b8fe6" containerName="swift-ring-rebalance"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.373481 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a90a621-c900-4f6b-8629-601fca5b8fe6" containerName="swift-ring-rebalance"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.374266 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.377414 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.379607 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"]
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.380387 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.532868 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.532930 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.532972 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.532991 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.533015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.533171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqnj\" (UniqueName: \"kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634669 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634729 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634767 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634834 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqnj\" (UniqueName: \"kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634872 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.634914 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.635419 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.635766 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.636084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.639846 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.644773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.672251 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqnj\" (UniqueName: \"kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj\") pod \"swift-ring-rebalance-debug-6nbng\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:28 crc kubenswrapper[4971]: I0309 09:52:28.702256 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:29 crc kubenswrapper[4971]: I0309 09:52:29.089788 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"]
Mar 09 09:52:29 crc kubenswrapper[4971]: I0309 09:52:29.165651 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a90a621-c900-4f6b-8629-601fca5b8fe6" path="/var/lib/kubelet/pods/6a90a621-c900-4f6b-8629-601fca5b8fe6/volumes"
Mar 09 09:52:29 crc kubenswrapper[4971]: I0309 09:52:29.969333 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng" event={"ID":"d7ffef04-0e69-476d-bb3d-929b38c4b835","Type":"ContainerStarted","Data":"6bdf31424b9fac476fcdee729b745912a6c0e7331c5d9d088040ee0700caaf27"}
Mar 09 09:52:29 crc kubenswrapper[4971]: I0309 09:52:29.969741 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng" event={"ID":"d7ffef04-0e69-476d-bb3d-929b38c4b835","Type":"ContainerStarted","Data":"b47a9cd9862bcb987abec06daface236021e8f1240c12c4354e4c1ec04433b28"}
Mar 09 09:52:29 crc kubenswrapper[4971]: I0309 09:52:29.988987 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng" podStartSLOduration=1.9889646650000001 podStartE2EDuration="1.988964665s" podCreationTimestamp="2026-03-09 09:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:29.983551588 +0000 UTC m=+1953.543479428" watchObservedRunningTime="2026-03-09 09:52:29.988964665 +0000 UTC m=+1953.548892475"
Mar 09 09:52:30 crc kubenswrapper[4971]: I0309 09:52:30.979033 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7ffef04-0e69-476d-bb3d-929b38c4b835" containerID="6bdf31424b9fac476fcdee729b745912a6c0e7331c5d9d088040ee0700caaf27" exitCode=0
Mar 09 09:52:30 crc kubenswrapper[4971]: I0309 09:52:30.979125 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng" event={"ID":"d7ffef04-0e69-476d-bb3d-929b38c4b835","Type":"ContainerDied","Data":"6bdf31424b9fac476fcdee729b745912a6c0e7331c5d9d088040ee0700caaf27"}
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.276421 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.307949 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"]
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.313001 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6nbng"]
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.405864 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") "
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.405930 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") "
Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.406020 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: 
\"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.406079 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.406107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.406180 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mqnj\" (UniqueName: \"kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj\") pod \"d7ffef04-0e69-476d-bb3d-929b38c4b835\" (UID: \"d7ffef04-0e69-476d-bb3d-929b38c4b835\") " Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.407005 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.407286 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.413318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj" (OuterVolumeSpecName: "kube-api-access-2mqnj") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "kube-api-access-2mqnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.430965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.431046 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts" (OuterVolumeSpecName: "scripts") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.435919 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d7ffef04-0e69-476d-bb3d-929b38c4b835" (UID: "d7ffef04-0e69-476d-bb3d-929b38c4b835"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507717 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507769 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7ffef04-0e69-476d-bb3d-929b38c4b835-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507783 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507795 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7ffef04-0e69-476d-bb3d-929b38c4b835-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507807 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7ffef04-0e69-476d-bb3d-929b38c4b835-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.507820 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mqnj\" (UniqueName: \"kubernetes.io/projected/d7ffef04-0e69-476d-bb3d-929b38c4b835-kube-api-access-2mqnj\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.996197 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b47a9cd9862bcb987abec06daface236021e8f1240c12c4354e4c1ec04433b28" Mar 09 09:52:32 crc kubenswrapper[4971]: I0309 09:52:32.996274 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6nbng" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.160172 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ffef04-0e69-476d-bb3d-929b38c4b835" path="/var/lib/kubelet/pods/d7ffef04-0e69-476d-bb3d-929b38c4b835/volumes" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.442272 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv"] Mar 09 09:52:33 crc kubenswrapper[4971]: E0309 09:52:33.442609 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ffef04-0e69-476d-bb3d-929b38c4b835" containerName="swift-ring-rebalance" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.442625 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ffef04-0e69-476d-bb3d-929b38c4b835" containerName="swift-ring-rebalance" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.442796 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ffef04-0e69-476d-bb3d-929b38c4b835" containerName="swift-ring-rebalance" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.443370 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.451976 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.451987 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.456207 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv"] Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521235 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvc8k\" (UniqueName: \"kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521477 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521589 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.521672 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.622667 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvc8k\" (UniqueName: \"kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.622741 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc 
kubenswrapper[4971]: I0309 09:52:33.622765 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.622784 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.622809 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.622840 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.623730 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 
09:52:33.624493 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.624528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.627408 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.630979 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.641517 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvc8k\" (UniqueName: \"kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k\") pod \"swift-ring-rebalance-debug-j6hkv\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:33 crc kubenswrapper[4971]: I0309 09:52:33.768385 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:34 crc kubenswrapper[4971]: W0309 09:52:34.185027 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6563c0a_d756_4fc6_8175_582dc8687511.slice/crio-6a222178d123b60491f822e60aa71e3db94e282fb4b3d83df68876ff962ae3f0 WatchSource:0}: Error finding container 6a222178d123b60491f822e60aa71e3db94e282fb4b3d83df68876ff962ae3f0: Status 404 returned error can't find the container with id 6a222178d123b60491f822e60aa71e3db94e282fb4b3d83df68876ff962ae3f0 Mar 09 09:52:34 crc kubenswrapper[4971]: I0309 09:52:34.185176 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv"] Mar 09 09:52:35 crc kubenswrapper[4971]: I0309 09:52:35.014489 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" event={"ID":"b6563c0a-d756-4fc6-8175-582dc8687511","Type":"ContainerStarted","Data":"9637f2c7d9b596a0022bdd874771092ff9c33d601f8048f17784582729bc9372"} Mar 09 09:52:35 crc kubenswrapper[4971]: I0309 09:52:35.014800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" event={"ID":"b6563c0a-d756-4fc6-8175-582dc8687511","Type":"ContainerStarted","Data":"6a222178d123b60491f822e60aa71e3db94e282fb4b3d83df68876ff962ae3f0"} Mar 09 09:52:35 crc kubenswrapper[4971]: I0309 09:52:35.035564 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" podStartSLOduration=2.035542891 podStartE2EDuration="2.035542891s" podCreationTimestamp="2026-03-09 09:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:35.03376751 +0000 UTC m=+1958.593695340" 
watchObservedRunningTime="2026-03-09 09:52:35.035542891 +0000 UTC m=+1958.595470701" Mar 09 09:52:35 crc kubenswrapper[4971]: E0309 09:52:35.558786 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6563c0a_d756_4fc6_8175_582dc8687511.slice/crio-conmon-9637f2c7d9b596a0022bdd874771092ff9c33d601f8048f17784582729bc9372.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:52:36 crc kubenswrapper[4971]: I0309 09:52:36.023489 4971 generic.go:334] "Generic (PLEG): container finished" podID="b6563c0a-d756-4fc6-8175-582dc8687511" containerID="9637f2c7d9b596a0022bdd874771092ff9c33d601f8048f17784582729bc9372" exitCode=0 Mar 09 09:52:36 crc kubenswrapper[4971]: I0309 09:52:36.023535 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" event={"ID":"b6563c0a-d756-4fc6-8175-582dc8687511","Type":"ContainerDied","Data":"9637f2c7d9b596a0022bdd874771092ff9c33d601f8048f17784582729bc9372"} Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.160245 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:52:37 crc kubenswrapper[4971]: E0309 09:52:37.160627 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.309649 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.339872 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv"] Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.345994 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv"] Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373363 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373416 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373522 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373571 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373590 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.373627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvc8k\" (UniqueName: \"kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k\") pod \"b6563c0a-d756-4fc6-8175-582dc8687511\" (UID: \"b6563c0a-d756-4fc6-8175-582dc8687511\") " Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.374522 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.374634 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.379616 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k" (OuterVolumeSpecName: "kube-api-access-qvc8k") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "kube-api-access-qvc8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.395098 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.396067 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts" (OuterVolumeSpecName: "scripts") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.397541 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b6563c0a-d756-4fc6-8175-582dc8687511" (UID: "b6563c0a-d756-4fc6-8175-582dc8687511"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474372 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474410 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474422 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6563c0a-d756-4fc6-8175-582dc8687511-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474433 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvc8k\" (UniqueName: \"kubernetes.io/projected/b6563c0a-d756-4fc6-8175-582dc8687511-kube-api-access-qvc8k\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474452 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b6563c0a-d756-4fc6-8175-582dc8687511-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:37 crc kubenswrapper[4971]: I0309 09:52:37.474483 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b6563c0a-d756-4fc6-8175-582dc8687511-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.037531 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a222178d123b60491f822e60aa71e3db94e282fb4b3d83df68876ff962ae3f0" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.037563 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6hkv" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.486382 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn"] Mar 09 09:52:38 crc kubenswrapper[4971]: E0309 09:52:38.486824 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6563c0a-d756-4fc6-8175-582dc8687511" containerName="swift-ring-rebalance" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.486842 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6563c0a-d756-4fc6-8175-582dc8687511" containerName="swift-ring-rebalance" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.487046 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6563c0a-d756-4fc6-8175-582dc8687511" containerName="swift-ring-rebalance" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.487611 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.489306 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.489719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.495241 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn"] Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.589742 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.590133 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.590171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn6jd\" (UniqueName: \"kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.590190 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.590248 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.590265 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.690880 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.690956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn6jd\" (UniqueName: \"kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.691017 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.691056 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.691079 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.691152 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.691917 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.692118 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.692570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.698172 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.702110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.709401 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn6jd\" (UniqueName: \"kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd\") pod \"swift-ring-rebalance-debug-hjjbn\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:38 crc kubenswrapper[4971]: I0309 09:52:38.806406 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:39 crc kubenswrapper[4971]: I0309 09:52:39.160622 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6563c0a-d756-4fc6-8175-582dc8687511" path="/var/lib/kubelet/pods/b6563c0a-d756-4fc6-8175-582dc8687511/volumes" Mar 09 09:52:39 crc kubenswrapper[4971]: I0309 09:52:39.225451 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn"] Mar 09 09:52:39 crc kubenswrapper[4971]: W0309 09:52:39.228109 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b90c54d_929b_455f_a649_7e0272beaf89.slice/crio-920ae5209ead5f36958e9dbabeb158b9ea459eab86772b4f8992352e7f2ed4ae WatchSource:0}: Error finding container 920ae5209ead5f36958e9dbabeb158b9ea459eab86772b4f8992352e7f2ed4ae: Status 404 returned error can't find the container with id 920ae5209ead5f36958e9dbabeb158b9ea459eab86772b4f8992352e7f2ed4ae Mar 09 09:52:40 crc kubenswrapper[4971]: I0309 09:52:40.051138 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" event={"ID":"6b90c54d-929b-455f-a649-7e0272beaf89","Type":"ContainerStarted","Data":"c790fcd5ec69025e216f89e967a044d0b659fc7fc83557580a7466e7fb5576f5"} Mar 09 09:52:40 crc kubenswrapper[4971]: I0309 09:52:40.052383 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" event={"ID":"6b90c54d-929b-455f-a649-7e0272beaf89","Type":"ContainerStarted","Data":"920ae5209ead5f36958e9dbabeb158b9ea459eab86772b4f8992352e7f2ed4ae"} Mar 09 09:52:40 crc kubenswrapper[4971]: I0309 09:52:40.071822 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" podStartSLOduration=2.071801979 podStartE2EDuration="2.071801979s" podCreationTimestamp="2026-03-09 
09:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:40.067923347 +0000 UTC m=+1963.627851157" watchObservedRunningTime="2026-03-09 09:52:40.071801979 +0000 UTC m=+1963.631729789" Mar 09 09:52:41 crc kubenswrapper[4971]: I0309 09:52:41.060851 4971 generic.go:334] "Generic (PLEG): container finished" podID="6b90c54d-929b-455f-a649-7e0272beaf89" containerID="c790fcd5ec69025e216f89e967a044d0b659fc7fc83557580a7466e7fb5576f5" exitCode=0 Mar 09 09:52:41 crc kubenswrapper[4971]: I0309 09:52:41.060949 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" event={"ID":"6b90c54d-929b-455f-a649-7e0272beaf89","Type":"ContainerDied","Data":"c790fcd5ec69025e216f89e967a044d0b659fc7fc83557580a7466e7fb5576f5"} Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.344898 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.381778 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn"] Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.388997 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn"] Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.445876 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.445998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.446064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.446607 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.446856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.470522 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn6jd\" (UniqueName: \"kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547139 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts\") pod \"6b90c54d-929b-455f-a649-7e0272beaf89\" (UID: \"6b90c54d-929b-455f-a649-7e0272beaf89\") " Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547508 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547532 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.547545 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b90c54d-929b-455f-a649-7e0272beaf89-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.551005 4971 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd" (OuterVolumeSpecName: "kube-api-access-vn6jd") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "kube-api-access-vn6jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.566068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts" (OuterVolumeSpecName: "scripts") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.571663 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6b90c54d-929b-455f-a649-7e0272beaf89" (UID: "6b90c54d-929b-455f-a649-7e0272beaf89"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.648533 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn6jd\" (UniqueName: \"kubernetes.io/projected/6b90c54d-929b-455f-a649-7e0272beaf89-kube-api-access-vn6jd\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.648567 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b90c54d-929b-455f-a649-7e0272beaf89-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:42 crc kubenswrapper[4971]: I0309 09:52:42.648579 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90c54d-929b-455f-a649-7e0272beaf89-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.081905 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="920ae5209ead5f36958e9dbabeb158b9ea459eab86772b4f8992352e7f2ed4ae" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.081958 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjjbn" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.159868 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b90c54d-929b-455f-a649-7e0272beaf89" path="/var/lib/kubelet/pods/6b90c54d-929b-455f-a649-7e0272beaf89/volumes" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.526104 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg"] Mar 09 09:52:43 crc kubenswrapper[4971]: E0309 09:52:43.526408 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b90c54d-929b-455f-a649-7e0272beaf89" containerName="swift-ring-rebalance" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.526419 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b90c54d-929b-455f-a649-7e0272beaf89" containerName="swift-ring-rebalance" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.526557 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b90c54d-929b-455f-a649-7e0272beaf89" containerName="swift-ring-rebalance" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.526978 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.528446 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.529599 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.538179 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg"] Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561540 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mlt\" (UniqueName: \"kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561624 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561649 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561676 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561701 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.561726 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.663176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mlt\" (UniqueName: \"kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.663261 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc 
kubenswrapper[4971]: I0309 09:52:43.663287 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.663310 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.663328 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.663373 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.664098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc 
kubenswrapper[4971]: I0309 09:52:43.665834 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.666027 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.668551 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.668819 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: I0309 09:52:43.682665 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mlt\" (UniqueName: \"kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt\") pod \"swift-ring-rebalance-debug-hm4hg\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:43 crc kubenswrapper[4971]: 
I0309 09:52:43.847092 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:44 crc kubenswrapper[4971]: I0309 09:52:44.252562 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg"] Mar 09 09:52:44 crc kubenswrapper[4971]: W0309 09:52:44.255083 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18df9828_4784_426d_9fab_5d3997adb7d2.slice/crio-e4d7a998cae108f18a80f18cb09ce84e86f22bc7e9c27be8e2610f5c0077fd08 WatchSource:0}: Error finding container e4d7a998cae108f18a80f18cb09ce84e86f22bc7e9c27be8e2610f5c0077fd08: Status 404 returned error can't find the container with id e4d7a998cae108f18a80f18cb09ce84e86f22bc7e9c27be8e2610f5c0077fd08 Mar 09 09:52:45 crc kubenswrapper[4971]: I0309 09:52:45.099329 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" event={"ID":"18df9828-4784-426d-9fab-5d3997adb7d2","Type":"ContainerStarted","Data":"e2465713fde0531a23362ad6760ad7da36b11e07dc49d14c5f7e7292edb7c0b4"} Mar 09 09:52:45 crc kubenswrapper[4971]: I0309 09:52:45.099694 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" event={"ID":"18df9828-4784-426d-9fab-5d3997adb7d2","Type":"ContainerStarted","Data":"e4d7a998cae108f18a80f18cb09ce84e86f22bc7e9c27be8e2610f5c0077fd08"} Mar 09 09:52:45 crc kubenswrapper[4971]: I0309 09:52:45.119693 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" podStartSLOduration=2.119674283 podStartE2EDuration="2.119674283s" podCreationTimestamp="2026-03-09 09:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:45.112394573 +0000 
UTC m=+1968.672322383" watchObservedRunningTime="2026-03-09 09:52:45.119674283 +0000 UTC m=+1968.679602093" Mar 09 09:52:45 crc kubenswrapper[4971]: E0309 09:52:45.741562 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18df9828_4784_426d_9fab_5d3997adb7d2.slice/crio-e2465713fde0531a23362ad6760ad7da36b11e07dc49d14c5f7e7292edb7c0b4.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:52:46 crc kubenswrapper[4971]: I0309 09:52:46.108440 4971 generic.go:334] "Generic (PLEG): container finished" podID="18df9828-4784-426d-9fab-5d3997adb7d2" containerID="e2465713fde0531a23362ad6760ad7da36b11e07dc49d14c5f7e7292edb7c0b4" exitCode=0 Mar 09 09:52:46 crc kubenswrapper[4971]: I0309 09:52:46.108492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" event={"ID":"18df9828-4784-426d-9fab-5d3997adb7d2","Type":"ContainerDied","Data":"e2465713fde0531a23362ad6760ad7da36b11e07dc49d14c5f7e7292edb7c0b4"} Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.405022 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.439273 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg"] Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.445804 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg"] Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516263 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516380 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516464 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4mlt\" (UniqueName: \"kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516498 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516533 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.516557 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf\") pod \"18df9828-4784-426d-9fab-5d3997adb7d2\" (UID: \"18df9828-4784-426d-9fab-5d3997adb7d2\") " Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.517227 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.518021 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.522333 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt" (OuterVolumeSpecName: "kube-api-access-f4mlt") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "kube-api-access-f4mlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.536291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts" (OuterVolumeSpecName: "scripts") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.540264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.547990 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "18df9828-4784-426d-9fab-5d3997adb7d2" (UID: "18df9828-4784-426d-9fab-5d3997adb7d2"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618761 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4mlt\" (UniqueName: \"kubernetes.io/projected/18df9828-4784-426d-9fab-5d3997adb7d2-kube-api-access-f4mlt\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618806 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618815 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18df9828-4784-426d-9fab-5d3997adb7d2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618826 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618835 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18df9828-4784-426d-9fab-5d3997adb7d2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:47 crc kubenswrapper[4971]: I0309 09:52:47.618843 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18df9828-4784-426d-9fab-5d3997adb7d2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.123923 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d7a998cae108f18a80f18cb09ce84e86f22bc7e9c27be8e2610f5c0077fd08" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.124179 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hm4hg" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.606560 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql"] Mar 09 09:52:48 crc kubenswrapper[4971]: E0309 09:52:48.607174 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18df9828-4784-426d-9fab-5d3997adb7d2" containerName="swift-ring-rebalance" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.607189 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="18df9828-4784-426d-9fab-5d3997adb7d2" containerName="swift-ring-rebalance" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.607398 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="18df9828-4784-426d-9fab-5d3997adb7d2" containerName="swift-ring-rebalance" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.607958 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.609761 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.609887 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.621095 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql"] Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtpc9\" (UniqueName: \"kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648406 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648439 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648458 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648483 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.648504 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts\") pod 
\"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.749888 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtpc9\" (UniqueName: \"kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.749971 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.750199 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.750242 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.750286 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.750319 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.750910 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.751169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.751198 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.755244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.755397 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.769343 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtpc9\" (UniqueName: \"kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9\") pod \"swift-ring-rebalance-debug-xj9ql\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:48 crc kubenswrapper[4971]: I0309 09:52:48.959530 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:49 crc kubenswrapper[4971]: I0309 09:52:49.166119 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18df9828-4784-426d-9fab-5d3997adb7d2" path="/var/lib/kubelet/pods/18df9828-4784-426d-9fab-5d3997adb7d2/volumes" Mar 09 09:52:49 crc kubenswrapper[4971]: I0309 09:52:49.351994 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql"] Mar 09 09:52:49 crc kubenswrapper[4971]: W0309 09:52:49.352983 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce33bc2f_96f8_4d22_87aa_d69bce69b2c7.slice/crio-40f2ac2c903e875dd3528a99d2b4ca0fcaf4ed83357cf25790d7483547e418ce WatchSource:0}: Error finding container 40f2ac2c903e875dd3528a99d2b4ca0fcaf4ed83357cf25790d7483547e418ce: Status 404 returned error can't find the container with id 40f2ac2c903e875dd3528a99d2b4ca0fcaf4ed83357cf25790d7483547e418ce Mar 09 09:52:50 crc kubenswrapper[4971]: I0309 09:52:50.139639 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" event={"ID":"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7","Type":"ContainerStarted","Data":"f1a9f5f4bfe9c9e580742c62e02d7fc6a7f7eb8dbee5c142e2a227ab5f7b0833"} Mar 09 09:52:50 crc kubenswrapper[4971]: I0309 09:52:50.139697 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" event={"ID":"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7","Type":"ContainerStarted","Data":"40f2ac2c903e875dd3528a99d2b4ca0fcaf4ed83357cf25790d7483547e418ce"} Mar 09 09:52:50 crc kubenswrapper[4971]: I0309 09:52:50.152503 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:52:50 crc kubenswrapper[4971]: E0309 09:52:50.153088 4971 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:52:51 crc kubenswrapper[4971]: I0309 09:52:51.148707 4971 generic.go:334] "Generic (PLEG): container finished" podID="ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" containerID="f1a9f5f4bfe9c9e580742c62e02d7fc6a7f7eb8dbee5c142e2a227ab5f7b0833" exitCode=0 Mar 09 09:52:51 crc kubenswrapper[4971]: I0309 09:52:51.148771 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" event={"ID":"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7","Type":"ContainerDied","Data":"f1a9f5f4bfe9c9e580742c62e02d7fc6a7f7eb8dbee5c142e2a227ab5f7b0833"} Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.410634 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.446875 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql"] Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.454063 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql"] Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609769 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609839 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609883 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtpc9\" 
(UniqueName: \"kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.609927 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift\") pod \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\" (UID: \"ce33bc2f-96f8-4d22-87aa-d69bce69b2c7\") " Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.610215 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.611805 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.623012 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9" (OuterVolumeSpecName: "kube-api-access-wtpc9") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "kube-api-access-wtpc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.638371 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.644901 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.646761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts" (OuterVolumeSpecName: "scripts") pod "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" (UID: "ce33bc2f-96f8-4d22-87aa-d69bce69b2c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.710928 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.710975 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.710989 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtpc9\" (UniqueName: \"kubernetes.io/projected/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-kube-api-access-wtpc9\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.711003 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.711013 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:52 crc kubenswrapper[4971]: I0309 09:52:52.711023 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.160997 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" path="/var/lib/kubelet/pods/ce33bc2f-96f8-4d22-87aa-d69bce69b2c7/volumes" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.164479 4971 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xj9ql" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.164483 4971 scope.go:117] "RemoveContainer" containerID="f1a9f5f4bfe9c9e580742c62e02d7fc6a7f7eb8dbee5c142e2a227ab5f7b0833" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.581039 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"] Mar 09 09:52:53 crc kubenswrapper[4971]: E0309 09:52:53.581674 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" containerName="swift-ring-rebalance" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.581686 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" containerName="swift-ring-rebalance" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.581851 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce33bc2f-96f8-4d22-87aa-d69bce69b2c7" containerName="swift-ring-rebalance" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.582398 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.585116 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.585117 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.596294 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"] Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626274 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626324 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626430 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7kt\" (UniqueName: \"kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626455 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626477 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.626498 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727808 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7kt\" (UniqueName: \"kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727878 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727903 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727929 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.727945 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.728801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.729422 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.729427 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.733311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.733311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.748783 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7kt\" (UniqueName: \"kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt\") pod \"swift-ring-rebalance-debug-9z29w\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:53 crc kubenswrapper[4971]: I0309 09:52:53.899571 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:54 crc kubenswrapper[4971]: W0309 09:52:54.301045 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38c6b7c5_b394_428f_8e99_85de20a76eba.slice/crio-f859fedc3239bafb50a8c3cc4bc427d8cf4ba19e2c4ed943e091e35646ecd65e WatchSource:0}: Error finding container f859fedc3239bafb50a8c3cc4bc427d8cf4ba19e2c4ed943e091e35646ecd65e: Status 404 returned error can't find the container with id f859fedc3239bafb50a8c3cc4bc427d8cf4ba19e2c4ed943e091e35646ecd65e
Mar 09 09:52:54 crc kubenswrapper[4971]: I0309 09:52:54.301520 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"]
Mar 09 09:52:55 crc kubenswrapper[4971]: I0309 09:52:55.182492 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" event={"ID":"38c6b7c5-b394-428f-8e99-85de20a76eba","Type":"ContainerStarted","Data":"e0be9f0aa4d9663449403044f1891acba9dccc3538adc7e5ef36feb822a5a6dd"}
Mar 09 09:52:55 crc kubenswrapper[4971]: I0309 09:52:55.182560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" event={"ID":"38c6b7c5-b394-428f-8e99-85de20a76eba","Type":"ContainerStarted","Data":"f859fedc3239bafb50a8c3cc4bc427d8cf4ba19e2c4ed943e091e35646ecd65e"}
Mar 09 09:52:55 crc kubenswrapper[4971]: I0309 09:52:55.201299 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" podStartSLOduration=2.201281252 podStartE2EDuration="2.201281252s" podCreationTimestamp="2026-03-09 09:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:52:55.198404199 +0000 UTC m=+1978.758332009" watchObservedRunningTime="2026-03-09 09:52:55.201281252 +0000 UTC m=+1978.761209062"
Mar 09 09:52:56 crc kubenswrapper[4971]: I0309 09:52:56.191763 4971 generic.go:334] "Generic (PLEG): container finished" podID="38c6b7c5-b394-428f-8e99-85de20a76eba" containerID="e0be9f0aa4d9663449403044f1891acba9dccc3538adc7e5ef36feb822a5a6dd" exitCode=0
Mar 09 09:52:56 crc kubenswrapper[4971]: I0309 09:52:56.191803 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w" event={"ID":"38c6b7c5-b394-428f-8e99-85de20a76eba","Type":"ContainerDied","Data":"e0be9f0aa4d9663449403044f1891acba9dccc3538adc7e5ef36feb822a5a6dd"}
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.499168 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.545323 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"]
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.553772 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"]
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682500 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682852 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682890 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682927 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682946 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.682997 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7kt\" (UniqueName: \"kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt\") pod \"38c6b7c5-b394-428f-8e99-85de20a76eba\" (UID: \"38c6b7c5-b394-428f-8e99-85de20a76eba\") "
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.684031 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.684399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.688382 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt" (OuterVolumeSpecName: "kube-api-access-rb7kt") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "kube-api-access-rb7kt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.704216 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts" (OuterVolumeSpecName: "scripts") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.704503 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.705040 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "38c6b7c5-b394-428f-8e99-85de20a76eba" (UID: "38c6b7c5-b394-428f-8e99-85de20a76eba"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784181 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784215 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784226 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38c6b7c5-b394-428f-8e99-85de20a76eba-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784238 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7kt\" (UniqueName: \"kubernetes.io/projected/38c6b7c5-b394-428f-8e99-85de20a76eba-kube-api-access-rb7kt\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784275 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38c6b7c5-b394-428f-8e99-85de20a76eba-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:57 crc kubenswrapper[4971]: I0309 09:52:57.784286 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38c6b7c5-b394-428f-8e99-85de20a76eba-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.207950 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f859fedc3239bafb50a8c3cc4bc427d8cf4ba19e2c4ed943e091e35646ecd65e"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.208031 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9z29w"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.658653 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"]
Mar 09 09:52:58 crc kubenswrapper[4971]: E0309 09:52:58.659978 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c6b7c5-b394-428f-8e99-85de20a76eba" containerName="swift-ring-rebalance"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.660055 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c6b7c5-b394-428f-8e99-85de20a76eba" containerName="swift-ring-rebalance"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.660257 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c6b7c5-b394-428f-8e99-85de20a76eba" containerName="swift-ring-rebalance"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.660809 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.662644 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.663654 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.667757 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"]
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695423 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695497 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695535 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695575 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695624 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.695677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhcr\" (UniqueName: \"kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797272 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797365 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797441 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797465 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhcr\" (UniqueName: \"kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.797801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.798185 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.798393 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.805496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.805695 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.813768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhcr\" (UniqueName: \"kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr\") pod \"swift-ring-rebalance-debug-brq6x\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:58 crc kubenswrapper[4971]: I0309 09:52:58.979994 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:52:59 crc kubenswrapper[4971]: I0309 09:52:59.165973 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c6b7c5-b394-428f-8e99-85de20a76eba" path="/var/lib/kubelet/pods/38c6b7c5-b394-428f-8e99-85de20a76eba/volumes"
Mar 09 09:52:59 crc kubenswrapper[4971]: I0309 09:52:59.387708 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"]
Mar 09 09:53:00 crc kubenswrapper[4971]: I0309 09:53:00.226544 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x" event={"ID":"5189e94a-609a-4541-adb9-575b20cd3d46","Type":"ContainerStarted","Data":"24c6684cd79a45ec74fa83c6ef6bfd8dd44d58e1036dc7d00eff11d76e91f184"}
Mar 09 09:53:00 crc kubenswrapper[4971]: I0309 09:53:00.226870 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x" event={"ID":"5189e94a-609a-4541-adb9-575b20cd3d46","Type":"ContainerStarted","Data":"de9190bb3b7dc68462c7187bb305c238aebc68e20cff2479947f66754ffe9ebd"}
Mar 09 09:53:00 crc kubenswrapper[4971]: I0309 09:53:00.249982 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x" podStartSLOduration=2.249964138 podStartE2EDuration="2.249964138s" podCreationTimestamp="2026-03-09 09:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:00.245054557 +0000 UTC m=+1983.804982377" watchObservedRunningTime="2026-03-09 09:53:00.249964138 +0000 UTC m=+1983.809891948"
Mar 09 09:53:01 crc kubenswrapper[4971]: I0309 09:53:01.241419 4971 generic.go:334] "Generic (PLEG): container finished" podID="5189e94a-609a-4541-adb9-575b20cd3d46" containerID="24c6684cd79a45ec74fa83c6ef6bfd8dd44d58e1036dc7d00eff11d76e91f184" exitCode=0
Mar 09 09:53:01 crc kubenswrapper[4971]: I0309 09:53:01.241471 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x" event={"ID":"5189e94a-609a-4541-adb9-575b20cd3d46","Type":"ContainerDied","Data":"24c6684cd79a45ec74fa83c6ef6bfd8dd44d58e1036dc7d00eff11d76e91f184"}
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.536851 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555405 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555465 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555501 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhcr\" (UniqueName: \"kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555537 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555557 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.555599 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts\") pod \"5189e94a-609a-4541-adb9-575b20cd3d46\" (UID: \"5189e94a-609a-4541-adb9-575b20cd3d46\") "
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.556267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.556421 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.572669 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr" (OuterVolumeSpecName: "kube-api-access-qjhcr") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "kube-api-access-qjhcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.580539 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.580863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.587039 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts" (OuterVolumeSpecName: "scripts") pod "5189e94a-609a-4541-adb9-575b20cd3d46" (UID: "5189e94a-609a-4541-adb9-575b20cd3d46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.591665 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"]
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.598079 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"]
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657009 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657044 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5189e94a-609a-4541-adb9-575b20cd3d46-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657058 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhcr\" (UniqueName: \"kubernetes.io/projected/5189e94a-609a-4541-adb9-575b20cd3d46-kube-api-access-qjhcr\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657070 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5189e94a-609a-4541-adb9-575b20cd3d46-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657082 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:02 crc kubenswrapper[4971]: I0309 09:53:02.657093 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5189e94a-609a-4541-adb9-575b20cd3d46-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.160521 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5189e94a-609a-4541-adb9-575b20cd3d46" path="/var/lib/kubelet/pods/5189e94a-609a-4541-adb9-575b20cd3d46/volumes"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.257654 4971 scope.go:117] "RemoveContainer" containerID="24c6684cd79a45ec74fa83c6ef6bfd8dd44d58e1036dc7d00eff11d76e91f184"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.257704 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-brq6x"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.744976 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"]
Mar 09 09:53:03 crc kubenswrapper[4971]: E0309 09:53:03.746071 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5189e94a-609a-4541-adb9-575b20cd3d46" containerName="swift-ring-rebalance"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.746092 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5189e94a-609a-4541-adb9-575b20cd3d46" containerName="swift-ring-rebalance"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.746337 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5189e94a-609a-4541-adb9-575b20cd3d46" containerName="swift-ring-rebalance"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.747114 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.751145 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.751175 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.757952 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"]
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775683 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775739 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775827 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775848 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqp9h\" (UniqueName: \"kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.775870 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.876934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.876990 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.877022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.877091 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.877114 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqp9h\" (UniqueName: \"kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.877144 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.877739 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.878266 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.878653 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.881902 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.881972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:03 crc kubenswrapper[4971]: I0309 09:53:03.897183 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqp9h\" (UniqueName: \"kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h\") pod \"swift-ring-rebalance-debug-x5c8b\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:04 crc kubenswrapper[4971]: I0309 09:53:04.066742 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"
Mar 09 09:53:04 crc kubenswrapper[4971]: I0309 09:53:04.517295 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"]
Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.152784 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:53:05 crc kubenswrapper[4971]: E0309 09:53:05.153569 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.276679 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" event={"ID":"8640797f-0c0e-4bd1-8d54-ff6c7e22560e","Type":"ContainerStarted","Data":"5749369ae261d05f2985c4d1397094182354c50c40b2b8ee796b9f35da067601"}
Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.276721 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" event={"ID":"8640797f-0c0e-4bd1-8d54-ff6c7e22560e","Type":"ContainerStarted","Data":"68f855fd7b3888cc59d2aabf6972acd512b1716575938f1f6ef2437283293e9f"}
Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.298837 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" podStartSLOduration=2.298821269 podStartE2EDuration="2.298821269s" podCreationTimestamp="2026-03-09 09:53:03 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:05.291401595 +0000 UTC m=+1988.851329405" watchObservedRunningTime="2026-03-09 09:53:05.298821269 +0000 UTC m=+1988.858749079" Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.429917 4971 scope.go:117] "RemoveContainer" containerID="fd8395bfcf18228ee414683e0617de3257daee4ebc444aba1ae494954145ae74" Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.450969 4971 scope.go:117] "RemoveContainer" containerID="2efd9b419d55287a59dc97fae07b71cb7de6619d58e834fc359dd77099c36d6d" Mar 09 09:53:05 crc kubenswrapper[4971]: I0309 09:53:05.477606 4971 scope.go:117] "RemoveContainer" containerID="1b91ce1a349e1767f2c4e4ac532a41fb583199ab20c0216590a4dabff0285f79" Mar 09 09:53:06 crc kubenswrapper[4971]: I0309 09:53:06.287551 4971 generic.go:334] "Generic (PLEG): container finished" podID="8640797f-0c0e-4bd1-8d54-ff6c7e22560e" containerID="5749369ae261d05f2985c4d1397094182354c50c40b2b8ee796b9f35da067601" exitCode=0 Mar 09 09:53:06 crc kubenswrapper[4971]: I0309 09:53:06.287891 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" event={"ID":"8640797f-0c0e-4bd1-8d54-ff6c7e22560e","Type":"ContainerDied","Data":"5749369ae261d05f2985c4d1397094182354c50c40b2b8ee796b9f35da067601"} Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.615996 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.645621 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"] Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656244 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqp9h\" (UniqueName: \"kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656287 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656320 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656375 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656414 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") 
" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.656468 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices\") pod \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\" (UID: \"8640797f-0c0e-4bd1-8d54-ff6c7e22560e\") " Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.657210 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.657504 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.658382 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b"] Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.682908 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts" (OuterVolumeSpecName: "scripts") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.683486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h" (OuterVolumeSpecName: "kube-api-access-tqp9h") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "kube-api-access-tqp9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.686050 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.707695 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8640797f-0c0e-4bd1-8d54-ff6c7e22560e" (UID: "8640797f-0c0e-4bd1-8d54-ff6c7e22560e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757585 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757643 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757657 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757671 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqp9h\" (UniqueName: \"kubernetes.io/projected/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-kube-api-access-tqp9h\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757683 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:07 crc kubenswrapper[4971]: I0309 09:53:07.757694 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8640797f-0c0e-4bd1-8d54-ff6c7e22560e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.304240 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f855fd7b3888cc59d2aabf6972acd512b1716575938f1f6ef2437283293e9f" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.304319 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x5c8b" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.788065 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn"] Mar 09 09:53:08 crc kubenswrapper[4971]: E0309 09:53:08.788585 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8640797f-0c0e-4bd1-8d54-ff6c7e22560e" containerName="swift-ring-rebalance" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.788597 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8640797f-0c0e-4bd1-8d54-ff6c7e22560e" containerName="swift-ring-rebalance" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.788728 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8640797f-0c0e-4bd1-8d54-ff6c7e22560e" containerName="swift-ring-rebalance" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.789145 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.792283 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.798818 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn"] Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.799739 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972491 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972664 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972697 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:08 crc kubenswrapper[4971]: I0309 09:53:08.972721 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq69w\" (UniqueName: 
\"kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074032 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074092 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074128 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq69w\" (UniqueName: \"kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074188 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.074256 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.075118 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.075483 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.075729 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.079276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.088899 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.092894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq69w\" (UniqueName: \"kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w\") pod \"swift-ring-rebalance-debug-nlgtn\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.114040 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.167737 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8640797f-0c0e-4bd1-8d54-ff6c7e22560e" path="/var/lib/kubelet/pods/8640797f-0c0e-4bd1-8d54-ff6c7e22560e/volumes" Mar 09 09:53:09 crc kubenswrapper[4971]: I0309 09:53:09.395863 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn"] Mar 09 09:53:10 crc kubenswrapper[4971]: I0309 09:53:10.322123 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" event={"ID":"6def4da2-585c-4d22-bd0c-25f0598bcaee","Type":"ContainerStarted","Data":"83596e5bfcd8fbcc5fb656f4bea30e7c1a6dc4ee007739339fd54593b2486ba1"} Mar 09 09:53:10 crc kubenswrapper[4971]: I0309 09:53:10.323389 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" event={"ID":"6def4da2-585c-4d22-bd0c-25f0598bcaee","Type":"ContainerStarted","Data":"74daa1b72fe491f2c6c19bc48654bf692741e7b078378b420bb21232d9abee8c"} Mar 09 09:53:10 crc kubenswrapper[4971]: I0309 09:53:10.338770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" podStartSLOduration=2.338738823 podStartE2EDuration="2.338738823s" podCreationTimestamp="2026-03-09 09:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:10.335096808 +0000 UTC m=+1993.895024618" watchObservedRunningTime="2026-03-09 09:53:10.338738823 +0000 UTC m=+1993.898666633" Mar 09 09:53:11 crc kubenswrapper[4971]: I0309 09:53:11.330498 4971 generic.go:334] "Generic (PLEG): container finished" podID="6def4da2-585c-4d22-bd0c-25f0598bcaee" containerID="83596e5bfcd8fbcc5fb656f4bea30e7c1a6dc4ee007739339fd54593b2486ba1" exitCode=0 
Mar 09 09:53:11 crc kubenswrapper[4971]: I0309 09:53:11.330546 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" event={"ID":"6def4da2-585c-4d22-bd0c-25f0598bcaee","Type":"ContainerDied","Data":"83596e5bfcd8fbcc5fb656f4bea30e7c1a6dc4ee007739339fd54593b2486ba1"} Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.638695 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.666272 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn"] Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.673102 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn"] Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.833429 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.833587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.834311 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.834572 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.834620 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq69w\" (UniqueName: \"kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.834703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift\") pod \"6def4da2-585c-4d22-bd0c-25f0598bcaee\" (UID: \"6def4da2-585c-4d22-bd0c-25f0598bcaee\") " Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.835150 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.835275 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.836163 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.838618 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w" (OuterVolumeSpecName: "kube-api-access-bq69w") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). InnerVolumeSpecName "kube-api-access-bq69w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.854527 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts" (OuterVolumeSpecName: "scripts") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.856587 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.859788 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6def4da2-585c-4d22-bd0c-25f0598bcaee" (UID: "6def4da2-585c-4d22-bd0c-25f0598bcaee"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.935795 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6def4da2-585c-4d22-bd0c-25f0598bcaee-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.936012 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.936023 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6def4da2-585c-4d22-bd0c-25f0598bcaee-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.936031 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq69w\" (UniqueName: \"kubernetes.io/projected/6def4da2-585c-4d22-bd0c-25f0598bcaee-kube-api-access-bq69w\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:12 crc kubenswrapper[4971]: I0309 09:53:12.936041 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6def4da2-585c-4d22-bd0c-25f0598bcaee-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.170231 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6def4da2-585c-4d22-bd0c-25f0598bcaee" path="/var/lib/kubelet/pods/6def4da2-585c-4d22-bd0c-25f0598bcaee/volumes" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.350467 4971 scope.go:117] "RemoveContainer" containerID="83596e5bfcd8fbcc5fb656f4bea30e7c1a6dc4ee007739339fd54593b2486ba1" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.350610 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nlgtn" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.804112 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7"] Mar 09 09:53:13 crc kubenswrapper[4971]: E0309 09:53:13.804457 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6def4da2-585c-4d22-bd0c-25f0598bcaee" containerName="swift-ring-rebalance" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.804470 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6def4da2-585c-4d22-bd0c-25f0598bcaee" containerName="swift-ring-rebalance" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.804592 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6def4da2-585c-4d22-bd0c-25f0598bcaee" containerName="swift-ring-rebalance" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.805043 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.808872 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.809302 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.816215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7"] Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846163 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846284 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846312 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846335 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846517 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.846556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbhn\" (UniqueName: \"kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.947383 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.947479 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc 
kubenswrapper[4971]: I0309 09:53:13.947503 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.947520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.947553 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.947572 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbhn\" (UniqueName: \"kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.948321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 
09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.948338 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.948840 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.952029 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.952451 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:13 crc kubenswrapper[4971]: I0309 09:53:13.965770 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbhn\" (UniqueName: \"kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn\") pod \"swift-ring-rebalance-debug-gzhv7\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:14 crc kubenswrapper[4971]: I0309 
09:53:14.134482 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:14 crc kubenswrapper[4971]: I0309 09:53:14.580924 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7"] Mar 09 09:53:14 crc kubenswrapper[4971]: W0309 09:53:14.587739 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ea662c_ee35_4695_873f_090741155dfc.slice/crio-93112eaaf4a636d148cc4d9841fe1f5cac003189dc0055716bf49ada4352db8a WatchSource:0}: Error finding container 93112eaaf4a636d148cc4d9841fe1f5cac003189dc0055716bf49ada4352db8a: Status 404 returned error can't find the container with id 93112eaaf4a636d148cc4d9841fe1f5cac003189dc0055716bf49ada4352db8a Mar 09 09:53:15 crc kubenswrapper[4971]: I0309 09:53:15.366179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" event={"ID":"f5ea662c-ee35-4695-873f-090741155dfc","Type":"ContainerStarted","Data":"3a3bf38e04e48b829d4c1a02c9e5f7d78c8251844e6871eec6d3e9709c9b61a8"} Mar 09 09:53:15 crc kubenswrapper[4971]: I0309 09:53:15.366548 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" event={"ID":"f5ea662c-ee35-4695-873f-090741155dfc","Type":"ContainerStarted","Data":"93112eaaf4a636d148cc4d9841fe1f5cac003189dc0055716bf49ada4352db8a"} Mar 09 09:53:15 crc kubenswrapper[4971]: I0309 09:53:15.380996 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" podStartSLOduration=2.380976854 podStartE2EDuration="2.380976854s" podCreationTimestamp="2026-03-09 09:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:15.379269764 +0000 UTC 
m=+1998.939197604" watchObservedRunningTime="2026-03-09 09:53:15.380976854 +0000 UTC m=+1998.940904664" Mar 09 09:53:16 crc kubenswrapper[4971]: I0309 09:53:16.375042 4971 generic.go:334] "Generic (PLEG): container finished" podID="f5ea662c-ee35-4695-873f-090741155dfc" containerID="3a3bf38e04e48b829d4c1a02c9e5f7d78c8251844e6871eec6d3e9709c9b61a8" exitCode=0 Mar 09 09:53:16 crc kubenswrapper[4971]: I0309 09:53:16.375190 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" event={"ID":"f5ea662c-ee35-4695-873f-090741155dfc","Type":"ContainerDied","Data":"3a3bf38e04e48b829d4c1a02c9e5f7d78c8251844e6871eec6d3e9709c9b61a8"} Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.164927 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:53:17 crc kubenswrapper[4971]: E0309 09:53:17.165202 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.669314 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.791171 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7"] Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.798861 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7"] Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815003 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzbhn\" (UniqueName: \"kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815230 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815421 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815519 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.815624 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift\") pod \"f5ea662c-ee35-4695-873f-090741155dfc\" (UID: \"f5ea662c-ee35-4695-873f-090741155dfc\") " Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.818697 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.818997 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.836475 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.837770 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.838136 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts" (OuterVolumeSpecName: "scripts") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.840701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn" (OuterVolumeSpecName: "kube-api-access-gzbhn") pod "f5ea662c-ee35-4695-873f-090741155dfc" (UID: "f5ea662c-ee35-4695-873f-090741155dfc"). InnerVolumeSpecName "kube-api-access-gzbhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917752 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzbhn\" (UniqueName: \"kubernetes.io/projected/f5ea662c-ee35-4695-873f-090741155dfc-kube-api-access-gzbhn\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917789 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917802 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917814 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f5ea662c-ee35-4695-873f-090741155dfc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917825 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f5ea662c-ee35-4695-873f-090741155dfc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:17 crc kubenswrapper[4971]: I0309 09:53:17.917835 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f5ea662c-ee35-4695-873f-090741155dfc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.393382 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93112eaaf4a636d148cc4d9841fe1f5cac003189dc0055716bf49ada4352db8a" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.393458 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzhv7" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.928245 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"] Mar 09 09:53:18 crc kubenswrapper[4971]: E0309 09:53:18.928921 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ea662c-ee35-4695-873f-090741155dfc" containerName="swift-ring-rebalance" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.928936 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ea662c-ee35-4695-873f-090741155dfc" containerName="swift-ring-rebalance" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.929105 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ea662c-ee35-4695-873f-090741155dfc" containerName="swift-ring-rebalance" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.929627 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.933698 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.934509 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:18 crc kubenswrapper[4971]: I0309 09:53:18.938081 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"] Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032529 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxff\" (UniqueName: \"kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032689 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032719 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.032740 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134313 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134408 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxff\" (UniqueName: \"kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134441 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134484 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134522 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.134551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.135261 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.135329 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.135644 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.143139 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.143333 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.150565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxff\" (UniqueName: \"kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff\") pod \"swift-ring-rebalance-debug-8h9vv\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.160342 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ea662c-ee35-4695-873f-090741155dfc" path="/var/lib/kubelet/pods/f5ea662c-ee35-4695-873f-090741155dfc/volumes" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.253684 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" Mar 09 09:53:19 crc kubenswrapper[4971]: I0309 09:53:19.655346 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"] Mar 09 09:53:19 crc kubenswrapper[4971]: W0309 09:53:19.656892 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod449a048a_278f_430c_a4b1_ce821639186e.slice/crio-0e1f01dd84d6c06e491a00a112bd02536c2f2445a65cd10225efc49cc8046f0b WatchSource:0}: Error finding container 0e1f01dd84d6c06e491a00a112bd02536c2f2445a65cd10225efc49cc8046f0b: Status 404 returned error can't find the container with id 0e1f01dd84d6c06e491a00a112bd02536c2f2445a65cd10225efc49cc8046f0b Mar 09 09:53:20 crc kubenswrapper[4971]: I0309 09:53:20.411128 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" event={"ID":"449a048a-278f-430c-a4b1-ce821639186e","Type":"ContainerStarted","Data":"390506f353cf9c3a7bd3fad7931f955dc6dbcb7131cf8fb86b077f7d51d703f0"} Mar 09 09:53:20 crc kubenswrapper[4971]: I0309 09:53:20.411414 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" event={"ID":"449a048a-278f-430c-a4b1-ce821639186e","Type":"ContainerStarted","Data":"0e1f01dd84d6c06e491a00a112bd02536c2f2445a65cd10225efc49cc8046f0b"} Mar 09 09:53:20 crc kubenswrapper[4971]: I0309 09:53:20.425645 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" podStartSLOduration=2.425626575 podStartE2EDuration="2.425626575s" podCreationTimestamp="2026-03-09 09:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:20.424176913 +0000 UTC m=+2003.984104743" watchObservedRunningTime="2026-03-09 
09:53:20.425626575 +0000 UTC m=+2003.985554385"
Mar 09 09:53:21 crc kubenswrapper[4971]: I0309 09:53:21.420746 4971 generic.go:334] "Generic (PLEG): container finished" podID="449a048a-278f-430c-a4b1-ce821639186e" containerID="390506f353cf9c3a7bd3fad7931f955dc6dbcb7131cf8fb86b077f7d51d703f0" exitCode=0
Mar 09 09:53:21 crc kubenswrapper[4971]: I0309 09:53:21.420822 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv" event={"ID":"449a048a-278f-430c-a4b1-ce821639186e","Type":"ContainerDied","Data":"390506f353cf9c3a7bd3fad7931f955dc6dbcb7131cf8fb86b077f7d51d703f0"}
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.681057 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.700986 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.701042 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.701083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.701174 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bxff\" (UniqueName: \"kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.701205 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.701279 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf\") pod \"449a048a-278f-430c-a4b1-ce821639186e\" (UID: \"449a048a-278f-430c-a4b1-ce821639186e\") "
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.702466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.704848 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.710571 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff" (OuterVolumeSpecName: "kube-api-access-4bxff") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "kube-api-access-4bxff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.715386 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"]
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.721267 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts" (OuterVolumeSpecName: "scripts") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.722076 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.722848 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "449a048a-278f-430c-a4b1-ce821639186e" (UID: "449a048a-278f-430c-a4b1-ce821639186e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.723897 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"]
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.802583 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bxff\" (UniqueName: \"kubernetes.io/projected/449a048a-278f-430c-a4b1-ce821639186e-kube-api-access-4bxff\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.802847 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/449a048a-278f-430c-a4b1-ce821639186e-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.802944 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.803021 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.803103 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/449a048a-278f-430c-a4b1-ce821639186e-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:22 crc kubenswrapper[4971]: I0309 09:53:22.803186 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/449a048a-278f-430c-a4b1-ce821639186e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.160131 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449a048a-278f-430c-a4b1-ce821639186e" path="/var/lib/kubelet/pods/449a048a-278f-430c-a4b1-ce821639186e/volumes"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.448971 4971 scope.go:117] "RemoveContainer" containerID="390506f353cf9c3a7bd3fad7931f955dc6dbcb7131cf8fb86b077f7d51d703f0"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.449251 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8h9vv"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.832451 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"]
Mar 09 09:53:23 crc kubenswrapper[4971]: E0309 09:53:23.832852 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449a048a-278f-430c-a4b1-ce821639186e" containerName="swift-ring-rebalance"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.832867 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="449a048a-278f-430c-a4b1-ce821639186e" containerName="swift-ring-rebalance"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.833019 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="449a048a-278f-430c-a4b1-ce821639186e" containerName="swift-ring-rebalance"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.833483 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.835517 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.835529 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.853921 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"]
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.918801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.919090 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jw7j\" (UniqueName: \"kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.919217 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.919317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.919454 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:23 crc kubenswrapper[4971]: I0309 09:53:23.919620 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.020974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021386 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jw7j\" (UniqueName: \"kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021690 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021764 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.021811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.022441 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.022789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.024972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.025023 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.039200 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jw7j\" (UniqueName: \"kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j\") pod \"swift-ring-rebalance-debug-tgl2x\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:24 crc kubenswrapper[4971]: I0309 09:53:24.158363 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:25 crc kubenswrapper[4971]: I0309 09:53:25.115932 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"]
Mar 09 09:53:25 crc kubenswrapper[4971]: W0309 09:53:25.118508 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bc66383_d8e7_48b2_a292_42a6b675f846.slice/crio-914fa96ae7a9b4ff27df358043e4e25c07059099cfb3a0e9b7b2eb5afa7705d7 WatchSource:0}: Error finding container 914fa96ae7a9b4ff27df358043e4e25c07059099cfb3a0e9b7b2eb5afa7705d7: Status 404 returned error can't find the container with id 914fa96ae7a9b4ff27df358043e4e25c07059099cfb3a0e9b7b2eb5afa7705d7
Mar 09 09:53:25 crc kubenswrapper[4971]: I0309 09:53:25.472237 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x" event={"ID":"7bc66383-d8e7-48b2-a292-42a6b675f846","Type":"ContainerStarted","Data":"b2295612262f57f838d04515c97b73a9e972ac509fd97301b2af1f412177bbcf"}
Mar 09 09:53:25 crc kubenswrapper[4971]: I0309 09:53:25.472626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x" event={"ID":"7bc66383-d8e7-48b2-a292-42a6b675f846","Type":"ContainerStarted","Data":"914fa96ae7a9b4ff27df358043e4e25c07059099cfb3a0e9b7b2eb5afa7705d7"}
Mar 09 09:53:25 crc kubenswrapper[4971]: I0309 09:53:25.495132 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x" podStartSLOduration=2.495111934 podStartE2EDuration="2.495111934s" podCreationTimestamp="2026-03-09 09:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:25.487069367 +0000 UTC m=+2009.046997177" watchObservedRunningTime="2026-03-09 09:53:25.495111934 +0000 UTC m=+2009.055039734"
Mar 09 09:53:27 crc kubenswrapper[4971]: I0309 09:53:27.488860 4971 generic.go:334] "Generic (PLEG): container finished" podID="7bc66383-d8e7-48b2-a292-42a6b675f846" containerID="b2295612262f57f838d04515c97b73a9e972ac509fd97301b2af1f412177bbcf" exitCode=0
Mar 09 09:53:27 crc kubenswrapper[4971]: I0309 09:53:27.488940 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x" event={"ID":"7bc66383-d8e7-48b2-a292-42a6b675f846","Type":"ContainerDied","Data":"b2295612262f57f838d04515c97b73a9e972ac509fd97301b2af1f412177bbcf"}
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.781290 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.814641 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"]
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.820111 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"]
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901467 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901570 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901611 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901656 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jw7j\" (UniqueName: \"kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901723 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.901767 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts\") pod \"7bc66383-d8e7-48b2-a292-42a6b675f846\" (UID: \"7bc66383-d8e7-48b2-a292-42a6b675f846\") "
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.902227 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.903624 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.908618 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j" (OuterVolumeSpecName: "kube-api-access-5jw7j") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "kube-api-access-5jw7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.925117 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts" (OuterVolumeSpecName: "scripts") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.928664 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:28 crc kubenswrapper[4971]: I0309 09:53:28.943290 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7bc66383-d8e7-48b2-a292-42a6b675f846" (UID: "7bc66383-d8e7-48b2-a292-42a6b675f846"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004172 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004253 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jw7j\" (UniqueName: \"kubernetes.io/projected/7bc66383-d8e7-48b2-a292-42a6b675f846-kube-api-access-5jw7j\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004273 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004289 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bc66383-d8e7-48b2-a292-42a6b675f846-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004300 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bc66383-d8e7-48b2-a292-42a6b675f846-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.004308 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bc66383-d8e7-48b2-a292-42a6b675f846-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.162012 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc66383-d8e7-48b2-a292-42a6b675f846" path="/var/lib/kubelet/pods/7bc66383-d8e7-48b2-a292-42a6b675f846/volumes"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.504924 4971 scope.go:117] "RemoveContainer" containerID="b2295612262f57f838d04515c97b73a9e972ac509fd97301b2af1f412177bbcf"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.504993 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tgl2x"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.954110 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"]
Mar 09 09:53:29 crc kubenswrapper[4971]: E0309 09:53:29.954819 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc66383-d8e7-48b2-a292-42a6b675f846" containerName="swift-ring-rebalance"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.954836 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc66383-d8e7-48b2-a292-42a6b675f846" containerName="swift-ring-rebalance"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.955018 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc66383-d8e7-48b2-a292-42a6b675f846" containerName="swift-ring-rebalance"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.955698 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.957544 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.958085 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:53:29 crc kubenswrapper[4971]: I0309 09:53:29.962886 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"]
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021384 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021475 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021529 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021584 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9cjb\" (UniqueName: \"kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021630 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.021786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.123807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.123911 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.123957 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.123982 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.124028 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9cjb\" (UniqueName: \"kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.124077 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.124720 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.125049 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.125049 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.128475 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.130001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.139873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9cjb\" (UniqueName: \"kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb\") pod \"swift-ring-rebalance-debug-2gzsl\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.272138 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:30 crc kubenswrapper[4971]: I0309 09:53:30.677777 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"]
Mar 09 09:53:31 crc kubenswrapper[4971]: I0309 09:53:31.522041 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl" event={"ID":"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e","Type":"ContainerStarted","Data":"47fdddbf06817cefed5a211ccd3459396039eaf1b67b3218d67c3d5dcc40ac18"}
Mar 09 09:53:31 crc kubenswrapper[4971]: I0309 09:53:31.522294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl" event={"ID":"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e","Type":"ContainerStarted","Data":"3269b626c441980f12a7ebd7495ffc41213f8fca415046b56a6b3ff8b1f44d14"}
Mar 09 09:53:31 crc kubenswrapper[4971]: I0309 09:53:31.544330 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl" podStartSLOduration=2.544312851 podStartE2EDuration="2.544312851s" podCreationTimestamp="2026-03-09 09:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:31.537609763 +0000 UTC m=+2015.097537583" watchObservedRunningTime="2026-03-09 09:53:31.544312851 +0000 UTC m=+2015.104240661"
Mar 09 09:53:32 crc kubenswrapper[4971]: I0309 09:53:32.151655 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:53:32 crc kubenswrapper[4971]: E0309 09:53:32.151880 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:53:32 crc kubenswrapper[4971]: I0309 09:53:32.530916 4971 generic.go:334] "Generic (PLEG): container finished" podID="7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" containerID="47fdddbf06817cefed5a211ccd3459396039eaf1b67b3218d67c3d5dcc40ac18" exitCode=0
Mar 09 09:53:32 crc kubenswrapper[4971]: I0309 09:53:32.530973 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl" event={"ID":"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e","Type":"ContainerDied","Data":"47fdddbf06817cefed5a211ccd3459396039eaf1b67b3218d67c3d5dcc40ac18"}
Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.784949 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"
Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.814135 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"]
Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.823278 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl"]
Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874633 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices\") pod \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874768 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts\") pod
\"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874832 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf\") pod \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874850 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift\") pod \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874871 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9cjb\" (UniqueName: \"kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb\") pod \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.874926 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf\") pod \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\" (UID: \"7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e\") " Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.876293 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.876958 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.881265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb" (OuterVolumeSpecName: "kube-api-access-k9cjb") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "kube-api-access-k9cjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.902490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.902810 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.903490 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts" (OuterVolumeSpecName: "scripts") pod "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" (UID: "7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976896 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976935 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976946 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976958 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9cjb\" (UniqueName: \"kubernetes.io/projected/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-kube-api-access-k9cjb\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976969 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:33 crc kubenswrapper[4971]: I0309 09:53:33.976977 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.546690 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3269b626c441980f12a7ebd7495ffc41213f8fca415046b56a6b3ff8b1f44d14" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.546713 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2gzsl" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.941476 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5"] Mar 09 09:53:34 crc kubenswrapper[4971]: E0309 09:53:34.942084 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" containerName="swift-ring-rebalance" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.942100 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" containerName="swift-ring-rebalance" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.942239 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" containerName="swift-ring-rebalance" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.942901 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.944824 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.944846 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.955239 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5"] Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.991948 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.992013 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.992114 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.992211 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jrm\" (UniqueName: \"kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.992270 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:34 crc kubenswrapper[4971]: I0309 09:53:34.992311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.094161 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jrm\" (UniqueName: \"kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.094275 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc 
kubenswrapper[4971]: I0309 09:53:35.094378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.094406 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.094431 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.094509 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.095156 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc 
kubenswrapper[4971]: I0309 09:53:35.095397 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.095572 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.101568 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.110040 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.114837 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jrm\" (UniqueName: \"kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm\") pod \"swift-ring-rebalance-debug-2mnm5\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: 
I0309 09:53:35.161982 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e" path="/var/lib/kubelet/pods/7d0a0d92-2bab-4e5f-8cf2-8621b904fc9e/volumes" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.291907 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:35 crc kubenswrapper[4971]: I0309 09:53:35.708019 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5"] Mar 09 09:53:36 crc kubenswrapper[4971]: I0309 09:53:36.561628 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" event={"ID":"aae10185-432c-4b63-9db5-789e49f9c9b0","Type":"ContainerStarted","Data":"34361f49e943722262d3935de3cc3fb473a0c544160ef918c9a89065e6e29ad2"} Mar 09 09:53:36 crc kubenswrapper[4971]: I0309 09:53:36.561985 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" event={"ID":"aae10185-432c-4b63-9db5-789e49f9c9b0","Type":"ContainerStarted","Data":"f221dff60176674b7778ed4c707f35cbda1ff2f7ce2cea70da30f48e03e497a7"} Mar 09 09:53:36 crc kubenswrapper[4971]: I0309 09:53:36.585702 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" podStartSLOduration=2.5856858799999998 podStartE2EDuration="2.58568588s" podCreationTimestamp="2026-03-09 09:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:36.578496987 +0000 UTC m=+2020.138424797" watchObservedRunningTime="2026-03-09 09:53:36.58568588 +0000 UTC m=+2020.145613680" Mar 09 09:53:38 crc kubenswrapper[4971]: I0309 09:53:38.578122 4971 generic.go:334] "Generic (PLEG): container finished" podID="aae10185-432c-4b63-9db5-789e49f9c9b0" 
containerID="34361f49e943722262d3935de3cc3fb473a0c544160ef918c9a89065e6e29ad2" exitCode=0 Mar 09 09:53:38 crc kubenswrapper[4971]: I0309 09:53:38.578210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" event={"ID":"aae10185-432c-4b63-9db5-789e49f9c9b0","Type":"ContainerDied","Data":"34361f49e943722262d3935de3cc3fb473a0c544160ef918c9a89065e6e29ad2"} Mar 09 09:53:39 crc kubenswrapper[4971]: I0309 09:53:39.901898 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:39 crc kubenswrapper[4971]: I0309 09:53:39.935124 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5"] Mar 09 09:53:39 crc kubenswrapper[4971]: I0309 09:53:39.942581 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5"] Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064701 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064802 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064830 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: 
\"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064908 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064934 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.064969 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95jrm\" (UniqueName: \"kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.065700 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.066080 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.072276 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm" (OuterVolumeSpecName: "kube-api-access-95jrm") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "kube-api-access-95jrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: E0309 09:53:40.088603 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts podName:aae10185-432c-4b63-9db5-789e49f9c9b0 nodeName:}" failed. No retries permitted until 2026-03-09 09:53:40.588578014 +0000 UTC m=+2024.148505824 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0") : error deleting /var/lib/kubelet/pods/aae10185-432c-4b63-9db5-789e49f9c9b0/volume-subpaths: remove /var/lib/kubelet/pods/aae10185-432c-4b63-9db5-789e49f9c9b0/volume-subpaths: no such file or directory Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.090209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.092606 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.166835 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.166870 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.166880 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95jrm\" (UniqueName: \"kubernetes.io/projected/aae10185-432c-4b63-9db5-789e49f9c9b0-kube-api-access-95jrm\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.166890 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aae10185-432c-4b63-9db5-789e49f9c9b0-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.166898 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aae10185-432c-4b63-9db5-789e49f9c9b0-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.594365 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f221dff60176674b7778ed4c707f35cbda1ff2f7ce2cea70da30f48e03e497a7" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.594445 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnm5" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.674231 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") pod \"aae10185-432c-4b63-9db5-789e49f9c9b0\" (UID: \"aae10185-432c-4b63-9db5-789e49f9c9b0\") " Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.674994 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts" (OuterVolumeSpecName: "scripts") pod "aae10185-432c-4b63-9db5-789e49f9c9b0" (UID: "aae10185-432c-4b63-9db5-789e49f9c9b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:40 crc kubenswrapper[4971]: I0309 09:53:40.776070 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aae10185-432c-4b63-9db5-789e49f9c9b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.050676 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8"] Mar 09 09:53:41 crc kubenswrapper[4971]: E0309 09:53:41.050990 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae10185-432c-4b63-9db5-789e49f9c9b0" containerName="swift-ring-rebalance" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.051004 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae10185-432c-4b63-9db5-789e49f9c9b0" containerName="swift-ring-rebalance" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.051152 4971 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="aae10185-432c-4b63-9db5-789e49f9c9b0" containerName="swift-ring-rebalance" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.051661 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.053892 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.061387 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8"] Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.062489 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.160657 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae10185-432c-4b63-9db5-789e49f9c9b0" path="/var/lib/kubelet/pods/aae10185-432c-4b63-9db5-789e49f9c9b0/volumes" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180638 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56jlc\" (UniqueName: \"kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180764 4971 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180794 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180829 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.180854 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282227 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" 
Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282333 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282374 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282411 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282428 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.282561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56jlc\" (UniqueName: \"kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.283225 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.283607 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.283792 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.289446 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.300453 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.301060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56jlc\" (UniqueName: \"kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc\") pod \"swift-ring-rebalance-debug-rmnq8\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.365772 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:41 crc kubenswrapper[4971]: I0309 09:53:41.685846 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8"] Mar 09 09:53:42 crc kubenswrapper[4971]: I0309 09:53:42.618693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" event={"ID":"edac722d-aedb-45ab-93d2-f0a5551d1f80","Type":"ContainerStarted","Data":"d282d14fcca4536ef1e21ceea9fef850f562b796bb9ec3f5ed35e04c8fd26570"} Mar 09 09:53:42 crc kubenswrapper[4971]: I0309 09:53:42.619040 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" event={"ID":"edac722d-aedb-45ab-93d2-f0a5551d1f80","Type":"ContainerStarted","Data":"f56fc733dbc4c292da9b3fdfaa22957bf69d3b976546cc51ad19416d88b71b7b"} Mar 09 09:53:42 crc kubenswrapper[4971]: I0309 09:53:42.644145 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" podStartSLOduration=1.6441237260000001 podStartE2EDuration="1.644123726s" podCreationTimestamp="2026-03-09 09:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:42.641713539 +0000 UTC m=+2026.201641349" 
watchObservedRunningTime="2026-03-09 09:53:42.644123726 +0000 UTC m=+2026.204051536" Mar 09 09:53:43 crc kubenswrapper[4971]: I0309 09:53:43.628680 4971 generic.go:334] "Generic (PLEG): container finished" podID="edac722d-aedb-45ab-93d2-f0a5551d1f80" containerID="d282d14fcca4536ef1e21ceea9fef850f562b796bb9ec3f5ed35e04c8fd26570" exitCode=0 Mar 09 09:53:43 crc kubenswrapper[4971]: I0309 09:53:43.628790 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" event={"ID":"edac722d-aedb-45ab-93d2-f0a5551d1f80","Type":"ContainerDied","Data":"d282d14fcca4536ef1e21ceea9fef850f562b796bb9ec3f5ed35e04c8fd26570"} Mar 09 09:53:44 crc kubenswrapper[4971]: I0309 09:53:44.915299 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:44 crc kubenswrapper[4971]: I0309 09:53:44.948065 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8"] Mar 09 09:53:44 crc kubenswrapper[4971]: I0309 09:53:44.952075 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8"] Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038106 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038208 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56jlc\" (UniqueName: \"kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038233 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038279 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.038420 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift\") pod \"edac722d-aedb-45ab-93d2-f0a5551d1f80\" (UID: \"edac722d-aedb-45ab-93d2-f0a5551d1f80\") " Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.039134 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.039265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.043962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc" (OuterVolumeSpecName: "kube-api-access-56jlc") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "kube-api-access-56jlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.056952 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts" (OuterVolumeSpecName: "scripts") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.058661 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.062863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "edac722d-aedb-45ab-93d2-f0a5551d1f80" (UID: "edac722d-aedb-45ab-93d2-f0a5551d1f80"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139882 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56jlc\" (UniqueName: \"kubernetes.io/projected/edac722d-aedb-45ab-93d2-f0a5551d1f80-kube-api-access-56jlc\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139919 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139931 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139940 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/edac722d-aedb-45ab-93d2-f0a5551d1f80-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139949 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/edac722d-aedb-45ab-93d2-f0a5551d1f80-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.139957 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/edac722d-aedb-45ab-93d2-f0a5551d1f80-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.162033 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edac722d-aedb-45ab-93d2-f0a5551d1f80" path="/var/lib/kubelet/pods/edac722d-aedb-45ab-93d2-f0a5551d1f80/volumes" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.653124 4971 scope.go:117] "RemoveContainer" containerID="d282d14fcca4536ef1e21ceea9fef850f562b796bb9ec3f5ed35e04c8fd26570" Mar 09 09:53:45 crc kubenswrapper[4971]: I0309 09:53:45.653158 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rmnq8" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.086277 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"] Mar 09 09:53:46 crc kubenswrapper[4971]: E0309 09:53:46.086898 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edac722d-aedb-45ab-93d2-f0a5551d1f80" containerName="swift-ring-rebalance" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.086910 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="edac722d-aedb-45ab-93d2-f0a5551d1f80" containerName="swift-ring-rebalance" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.087038 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="edac722d-aedb-45ab-93d2-f0a5551d1f80" containerName="swift-ring-rebalance" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.099910 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.103750 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.104830 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.109730 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"] Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.152574 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:53:46 crc kubenswrapper[4971]: E0309 09:53:46.152935 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256246 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256291 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j65x\" (UniqueName: \"kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x\") pod 
\"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256336 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256411 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256445 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.256561 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.357930 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.357994 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j65x\" (UniqueName: \"kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358057 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358078 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358637 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358941 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.358985 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.361434 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.361575 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.382710 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j65x\" (UniqueName: \"kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x\") pod \"swift-ring-rebalance-debug-vq7wg\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.421209 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:46 crc kubenswrapper[4971]: I0309 09:53:46.839512 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"] Mar 09 09:53:47 crc kubenswrapper[4971]: I0309 09:53:47.671485 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" event={"ID":"d5c0d196-8792-4fb5-9f87-73016ec85e9a","Type":"ContainerStarted","Data":"bfc6c65baef6ec6a03c68b7f356c51de958ad81a4f46d048aad101a602259734"} Mar 09 09:53:47 crc kubenswrapper[4971]: I0309 09:53:47.673000 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" event={"ID":"d5c0d196-8792-4fb5-9f87-73016ec85e9a","Type":"ContainerStarted","Data":"aae04b56c733451aa2f18cd5bf00420de487a3e4771bd1d77244b9649016bf17"} Mar 09 09:53:47 crc kubenswrapper[4971]: I0309 09:53:47.694576 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" podStartSLOduration=1.694551649 podStartE2EDuration="1.694551649s" podCreationTimestamp="2026-03-09 09:53:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:47.69031873 +0000 UTC m=+2031.250246550" watchObservedRunningTime="2026-03-09 09:53:47.694551649 +0000 UTC m=+2031.254479459" Mar 09 09:53:48 crc kubenswrapper[4971]: I0309 09:53:48.681727 4971 generic.go:334] "Generic (PLEG): container finished" podID="d5c0d196-8792-4fb5-9f87-73016ec85e9a" containerID="bfc6c65baef6ec6a03c68b7f356c51de958ad81a4f46d048aad101a602259734" exitCode=0 Mar 09 09:53:48 crc kubenswrapper[4971]: I0309 09:53:48.681802 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" event={"ID":"d5c0d196-8792-4fb5-9f87-73016ec85e9a","Type":"ContainerDied","Data":"bfc6c65baef6ec6a03c68b7f356c51de958ad81a4f46d048aad101a602259734"} Mar 09 09:53:49 crc kubenswrapper[4971]: I0309 09:53:49.993286 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg" Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.030934 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"] Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.036911 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"] Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.117606 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") " Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.117755 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") "
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.117779 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") "
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.117926 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") "
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.117966 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") "
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.118040 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j65x\" (UniqueName: \"kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x\") pod \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\" (UID: \"d5c0d196-8792-4fb5-9f87-73016ec85e9a\") "
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.118415 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.118592 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.123579 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x" (OuterVolumeSpecName: "kube-api-access-6j65x") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "kube-api-access-6j65x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.138870 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.142246 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.142284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts" (OuterVolumeSpecName: "scripts") pod "d5c0d196-8792-4fb5-9f87-73016ec85e9a" (UID: "d5c0d196-8792-4fb5-9f87-73016ec85e9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219699 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219755 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d5c0d196-8792-4fb5-9f87-73016ec85e9a-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219774 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219790 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d5c0d196-8792-4fb5-9f87-73016ec85e9a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219806 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j65x\" (UniqueName: \"kubernetes.io/projected/d5c0d196-8792-4fb5-9f87-73016ec85e9a-kube-api-access-6j65x\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.219824 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d5c0d196-8792-4fb5-9f87-73016ec85e9a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.697398 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae04b56c733451aa2f18cd5bf00420de487a3e4771bd1d77244b9649016bf17"
Mar 09 09:53:50 crc kubenswrapper[4971]: I0309 09:53:50.697451 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vq7wg"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.160696 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c0d196-8792-4fb5-9f87-73016ec85e9a" path="/var/lib/kubelet/pods/d5c0d196-8792-4fb5-9f87-73016ec85e9a/volumes"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.173270 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"]
Mar 09 09:53:51 crc kubenswrapper[4971]: E0309 09:53:51.173687 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c0d196-8792-4fb5-9f87-73016ec85e9a" containerName="swift-ring-rebalance"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.173713 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c0d196-8792-4fb5-9f87-73016ec85e9a" containerName="swift-ring-rebalance"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.173879 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c0d196-8792-4fb5-9f87-73016ec85e9a" containerName="swift-ring-rebalance"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.174513 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.180759 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.181123 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.187504 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"]
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336207 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336266 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cml5m\" (UniqueName: \"kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336424 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.336524 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.437286 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.437591 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.437713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cml5m\" (UniqueName: \"kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.437855 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.437949 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.438069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.438181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.438675 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.438779 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.440843 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.446756 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.453764 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cml5m\" (UniqueName: \"kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m\") pod \"swift-ring-rebalance-debug-87vzc\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.492402 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:51 crc kubenswrapper[4971]: I0309 09:53:51.904337 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"]
Mar 09 09:53:51 crc kubenswrapper[4971]: W0309 09:53:51.907685 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ee3330_8ad1_4605_a9da_98f27e18c803.slice/crio-dc425dd700cfaeca89dca96ae894a93de89d0454b9f1c578da46fb1f9c602d93 WatchSource:0}: Error finding container dc425dd700cfaeca89dca96ae894a93de89d0454b9f1c578da46fb1f9c602d93: Status 404 returned error can't find the container with id dc425dd700cfaeca89dca96ae894a93de89d0454b9f1c578da46fb1f9c602d93
Mar 09 09:53:52 crc kubenswrapper[4971]: I0309 09:53:52.715578 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc" event={"ID":"48ee3330-8ad1-4605-a9da-98f27e18c803","Type":"ContainerStarted","Data":"ed94226fc9f4effcd3f9f6bc4737c67be5445728269fa8ac24980b70dd292ec1"}
Mar 09 09:53:52 crc kubenswrapper[4971]: I0309 09:53:52.715921 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc" event={"ID":"48ee3330-8ad1-4605-a9da-98f27e18c803","Type":"ContainerStarted","Data":"dc425dd700cfaeca89dca96ae894a93de89d0454b9f1c578da46fb1f9c602d93"}
Mar 09 09:53:52 crc kubenswrapper[4971]: I0309 09:53:52.733882 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc" podStartSLOduration=1.733861091 podStartE2EDuration="1.733861091s" podCreationTimestamp="2026-03-09 09:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:53:52.729479197 +0000 UTC m=+2036.289407017" watchObservedRunningTime="2026-03-09 09:53:52.733861091 +0000 UTC m=+2036.293788901"
Mar 09 09:53:53 crc kubenswrapper[4971]: I0309 09:53:53.724940 4971 generic.go:334] "Generic (PLEG): container finished" podID="48ee3330-8ad1-4605-a9da-98f27e18c803" containerID="ed94226fc9f4effcd3f9f6bc4737c67be5445728269fa8ac24980b70dd292ec1" exitCode=0
Mar 09 09:53:53 crc kubenswrapper[4971]: I0309 09:53:53.725001 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc" event={"ID":"48ee3330-8ad1-4605-a9da-98f27e18c803","Type":"ContainerDied","Data":"ed94226fc9f4effcd3f9f6bc4737c67be5445728269fa8ac24980b70dd292ec1"}
Mar 09 09:53:54 crc kubenswrapper[4971]: I0309 09:53:54.984885 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.020526 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"]
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.028853 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"]
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088661 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088702 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cml5m\" (UniqueName: \"kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088739 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088782 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088842 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.088931 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf\") pod \"48ee3330-8ad1-4605-a9da-98f27e18c803\" (UID: \"48ee3330-8ad1-4605-a9da-98f27e18c803\") "
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.089428 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.089840 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.094190 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m" (OuterVolumeSpecName: "kube-api-access-cml5m") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "kube-api-access-cml5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.107438 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts" (OuterVolumeSpecName: "scripts") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.109609 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.113328 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "48ee3330-8ad1-4605-a9da-98f27e18c803" (UID: "48ee3330-8ad1-4605-a9da-98f27e18c803"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.160229 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ee3330-8ad1-4605-a9da-98f27e18c803" path="/var/lib/kubelet/pods/48ee3330-8ad1-4605-a9da-98f27e18c803/volumes"
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191130 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191154 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48ee3330-8ad1-4605-a9da-98f27e18c803-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191164 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191173 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48ee3330-8ad1-4605-a9da-98f27e18c803-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191182 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cml5m\" (UniqueName: \"kubernetes.io/projected/48ee3330-8ad1-4605-a9da-98f27e18c803-kube-api-access-cml5m\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.191190 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48ee3330-8ad1-4605-a9da-98f27e18c803-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.742242 4971 scope.go:117] "RemoveContainer" containerID="ed94226fc9f4effcd3f9f6bc4737c67be5445728269fa8ac24980b70dd292ec1"
Mar 09 09:53:55 crc kubenswrapper[4971]: I0309 09:53:55.742317 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-87vzc"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.147224 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"]
Mar 09 09:53:56 crc kubenswrapper[4971]: E0309 09:53:56.147764 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ee3330-8ad1-4605-a9da-98f27e18c803" containerName="swift-ring-rebalance"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.147788 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ee3330-8ad1-4605-a9da-98f27e18c803" containerName="swift-ring-rebalance"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.148065 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ee3330-8ad1-4605-a9da-98f27e18c803" containerName="swift-ring-rebalance"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.148907 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.151309 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.152226 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.153527 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"]
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.205821 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gh9t\" (UniqueName: \"kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.205992 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.206119 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.206205 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.206334 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.206443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307562 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307656 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gh9t\" (UniqueName: \"kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307691 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307794 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.307841 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.308194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.308596 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.308705 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.312277 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.313980 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.326015 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gh9t\" (UniqueName: \"kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t\") pod \"swift-ring-rebalance-debug-t8pql\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.476563 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:53:56 crc kubenswrapper[4971]: I0309 09:53:56.881636 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"]
Mar 09 09:53:56 crc kubenswrapper[4971]: W0309 09:53:56.885415 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42bf9093_3e65_44e7_8cb6_8e7b0f50e5a2.slice/crio-a679ce667899676b87cfc39f9a1bcc2b2a179f32a3a8018741154df3c1c80d98 WatchSource:0}: Error finding container a679ce667899676b87cfc39f9a1bcc2b2a179f32a3a8018741154df3c1c80d98: Status 404 returned error can't find the container with id a679ce667899676b87cfc39f9a1bcc2b2a179f32a3a8018741154df3c1c80d98
Mar 09 09:53:57 crc kubenswrapper[4971]: I0309 09:53:57.157663 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:53:57 crc kubenswrapper[4971]: E0309 09:53:57.158370 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:53:57 crc kubenswrapper[4971]: I0309 09:53:57.773505 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql" event={"ID":"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2","Type":"ContainerStarted","Data":"3148319edd60aa94712d85ca00a82a4c1396f1a74d3c620745666993e43a563a"}
Mar 09 09:53:57 crc kubenswrapper[4971]: I0309 09:53:57.773554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql" event={"ID":"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2","Type":"ContainerStarted","Data":"a679ce667899676b87cfc39f9a1bcc2b2a179f32a3a8018741154df3c1c80d98"}
Mar 09 09:53:58 crc kubenswrapper[4971]: I0309 09:53:58.785833 4971 generic.go:334] "Generic (PLEG): container finished" podID="42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" containerID="3148319edd60aa94712d85ca00a82a4c1396f1a74d3c620745666993e43a563a" exitCode=0
Mar 09 09:53:58 crc kubenswrapper[4971]: I0309 09:53:58.785918 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql" event={"ID":"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2","Type":"ContainerDied","Data":"3148319edd60aa94712d85ca00a82a4c1396f1a74d3c620745666993e43a563a"}
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.062939 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.096397 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"]
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.103509 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t8pql"]
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.137314 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550834-5m28w"]
Mar 09 09:54:00 crc kubenswrapper[4971]: E0309 09:54:00.137671 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" containerName="swift-ring-rebalance"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.137690 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" containerName="swift-ring-rebalance"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.137829 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" containerName="swift-ring-rebalance"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.138315 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-5m28w"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.140885 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.140930 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.141171 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.148514 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-5m28w"]
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166138 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift\") pod \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") "
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166195 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts\") pod \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") "
Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166235 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices\") pod
\"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166334 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf\") pod \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf\") pod \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.166436 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gh9t\" (UniqueName: \"kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t\") pod \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\" (UID: \"42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2\") " Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.167016 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.168584 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.173958 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t" (OuterVolumeSpecName: "kube-api-access-2gh9t") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "kube-api-access-2gh9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.189931 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts" (OuterVolumeSpecName: "scripts") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.192590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.195643 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" (UID: "42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268429 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ngs\" (UniqueName: \"kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs\") pod \"auto-csr-approver-29550834-5m28w\" (UID: \"afb29bf2-443e-4d7b-956a-001dcf4455e9\") " pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268736 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268752 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268763 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268773 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268784 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.268794 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gh9t\" (UniqueName: 
\"kubernetes.io/projected/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2-kube-api-access-2gh9t\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.370600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ngs\" (UniqueName: \"kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs\") pod \"auto-csr-approver-29550834-5m28w\" (UID: \"afb29bf2-443e-4d7b-956a-001dcf4455e9\") " pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.387330 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ngs\" (UniqueName: \"kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs\") pod \"auto-csr-approver-29550834-5m28w\" (UID: \"afb29bf2-443e-4d7b-956a-001dcf4455e9\") " pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.457522 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.810037 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a679ce667899676b87cfc39f9a1bcc2b2a179f32a3a8018741154df3c1c80d98" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.810213 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t8pql" Mar 09 09:54:00 crc kubenswrapper[4971]: I0309 09:54:00.880226 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-5m28w"] Mar 09 09:54:00 crc kubenswrapper[4971]: W0309 09:54:00.887058 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb29bf2_443e_4d7b_956a_001dcf4455e9.slice/crio-ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce WatchSource:0}: Error finding container ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce: Status 404 returned error can't find the container with id ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.160680 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2" path="/var/lib/kubelet/pods/42bf9093-3e65-44e7-8cb6-8e7b0f50e5a2/volumes" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.228835 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx"] Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.230082 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.233531 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.236061 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.239340 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx"] Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285485 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285601 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285620 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgtl\" (UniqueName: \"kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.285692 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.387322 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.387418 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 
crc kubenswrapper[4971]: I0309 09:54:01.387462 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.387485 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgtl\" (UniqueName: \"kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.387545 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.387566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.388008 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc 
kubenswrapper[4971]: I0309 09:54:01.388882 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.388986 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.393193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.394274 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.405202 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgtl\" (UniqueName: \"kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl\") pod \"swift-ring-rebalance-debug-jt4fx\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: 
I0309 09:54:01.555880 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.822590 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-5m28w" event={"ID":"afb29bf2-443e-4d7b-956a-001dcf4455e9","Type":"ContainerStarted","Data":"ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce"} Mar 09 09:54:01 crc kubenswrapper[4971]: I0309 09:54:01.825915 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx"] Mar 09 09:54:01 crc kubenswrapper[4971]: W0309 09:54:01.832509 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod727701ee_3475_4bf5_af07_2b6ae8372e29.slice/crio-a35f75e05520590eaa1bc1596bfc94a12987a7d08c92f1a5ef2a668b12d52a15 WatchSource:0}: Error finding container a35f75e05520590eaa1bc1596bfc94a12987a7d08c92f1a5ef2a668b12d52a15: Status 404 returned error can't find the container with id a35f75e05520590eaa1bc1596bfc94a12987a7d08c92f1a5ef2a668b12d52a15 Mar 09 09:54:02 crc kubenswrapper[4971]: I0309 09:54:02.831917 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" event={"ID":"727701ee-3475-4bf5-af07-2b6ae8372e29","Type":"ContainerStarted","Data":"1d2d9f4d574f0f9f5b31c9f7e0a87f1817f41fbe020e6e08463e724497da4a8d"} Mar 09 09:54:02 crc kubenswrapper[4971]: I0309 09:54:02.832553 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" event={"ID":"727701ee-3475-4bf5-af07-2b6ae8372e29","Type":"ContainerStarted","Data":"a35f75e05520590eaa1bc1596bfc94a12987a7d08c92f1a5ef2a668b12d52a15"} Mar 09 09:54:02 crc kubenswrapper[4971]: I0309 09:54:02.834041 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="afb29bf2-443e-4d7b-956a-001dcf4455e9" containerID="37bd5fa24ff71a172d90b87134a7011fd18934e20dd6fa4857ba70b1e0b2ac47" exitCode=0 Mar 09 09:54:02 crc kubenswrapper[4971]: I0309 09:54:02.834075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-5m28w" event={"ID":"afb29bf2-443e-4d7b-956a-001dcf4455e9","Type":"ContainerDied","Data":"37bd5fa24ff71a172d90b87134a7011fd18934e20dd6fa4857ba70b1e0b2ac47"} Mar 09 09:54:02 crc kubenswrapper[4971]: I0309 09:54:02.851323 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" podStartSLOduration=1.8513057659999999 podStartE2EDuration="1.851305766s" podCreationTimestamp="2026-03-09 09:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:02.847624742 +0000 UTC m=+2046.407552562" watchObservedRunningTime="2026-03-09 09:54:02.851305766 +0000 UTC m=+2046.411233576" Mar 09 09:54:03 crc kubenswrapper[4971]: I0309 09:54:03.843650 4971 generic.go:334] "Generic (PLEG): container finished" podID="727701ee-3475-4bf5-af07-2b6ae8372e29" containerID="1d2d9f4d574f0f9f5b31c9f7e0a87f1817f41fbe020e6e08463e724497da4a8d" exitCode=0 Mar 09 09:54:03 crc kubenswrapper[4971]: I0309 09:54:03.843824 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" event={"ID":"727701ee-3475-4bf5-af07-2b6ae8372e29","Type":"ContainerDied","Data":"1d2d9f4d574f0f9f5b31c9f7e0a87f1817f41fbe020e6e08463e724497da4a8d"} Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.127153 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.231782 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4ngs\" (UniqueName: \"kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs\") pod \"afb29bf2-443e-4d7b-956a-001dcf4455e9\" (UID: \"afb29bf2-443e-4d7b-956a-001dcf4455e9\") " Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.237826 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs" (OuterVolumeSpecName: "kube-api-access-m4ngs") pod "afb29bf2-443e-4d7b-956a-001dcf4455e9" (UID: "afb29bf2-443e-4d7b-956a-001dcf4455e9"). InnerVolumeSpecName "kube-api-access-m4ngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.335465 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4ngs\" (UniqueName: \"kubernetes.io/projected/afb29bf2-443e-4d7b-956a-001dcf4455e9-kube-api-access-m4ngs\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.869435 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-5m28w" Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.869532 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-5m28w" event={"ID":"afb29bf2-443e-4d7b-956a-001dcf4455e9","Type":"ContainerDied","Data":"ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce"} Mar 09 09:54:04 crc kubenswrapper[4971]: I0309 09:54:04.869569 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8fec7c01ac359f3a30a7bf8fef836a4d26da7f6c6f67403d5a2644da9173ce" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.156549 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.193059 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx"] Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.213715 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-8x5pt"] Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.222956 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-8x5pt"] Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.229555 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx"] Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248235 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248284 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248428 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.248472 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfgtl\" (UniqueName: \"kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl\") pod \"727701ee-3475-4bf5-af07-2b6ae8372e29\" (UID: \"727701ee-3475-4bf5-af07-2b6ae8372e29\") " Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.249313 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.249521 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.252943 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl" (OuterVolumeSpecName: "kube-api-access-sfgtl") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "kube-api-access-sfgtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.267245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts" (OuterVolumeSpecName: "scripts") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.277547 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.278019 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "727701ee-3475-4bf5-af07-2b6ae8372e29" (UID: "727701ee-3475-4bf5-af07-2b6ae8372e29"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350688 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350735 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350751 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/727701ee-3475-4bf5-af07-2b6ae8372e29-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350762 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/727701ee-3475-4bf5-af07-2b6ae8372e29-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350773 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/727701ee-3475-4bf5-af07-2b6ae8372e29-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.350787 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfgtl\" (UniqueName: 
\"kubernetes.io/projected/727701ee-3475-4bf5-af07-2b6ae8372e29-kube-api-access-sfgtl\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.880880 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a35f75e05520590eaa1bc1596bfc94a12987a7d08c92f1a5ef2a668b12d52a15" Mar 09 09:54:05 crc kubenswrapper[4971]: I0309 09:54:05.880932 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jt4fx" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.338402 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb"] Mar 09 09:54:06 crc kubenswrapper[4971]: E0309 09:54:06.338685 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727701ee-3475-4bf5-af07-2b6ae8372e29" containerName="swift-ring-rebalance" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.338700 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="727701ee-3475-4bf5-af07-2b6ae8372e29" containerName="swift-ring-rebalance" Mar 09 09:54:06 crc kubenswrapper[4971]: E0309 09:54:06.338742 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb29bf2-443e-4d7b-956a-001dcf4455e9" containerName="oc" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.338748 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb29bf2-443e-4d7b-956a-001dcf4455e9" containerName="oc" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.338873 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb29bf2-443e-4d7b-956a-001dcf4455e9" containerName="oc" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.338889 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="727701ee-3475-4bf5-af07-2b6ae8372e29" containerName="swift-ring-rebalance" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.339452 4971 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.341197 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.341737 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.351792 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb"] Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.465171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.465319 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.465587 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.465877 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.465945 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.466020 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhm6w\" (UniqueName: \"kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.567555 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.567620 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 
09:54:06.567662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhm6w\" (UniqueName: \"kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.567753 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.567788 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.567831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.568450 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc 
kubenswrapper[4971]: I0309 09:54:06.568493 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.568680 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.574182 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.574497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: I0309 09:54:06.588144 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhm6w\" (UniqueName: \"kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w\") pod \"swift-ring-rebalance-debug-f7ljb\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:06 crc kubenswrapper[4971]: 
I0309 09:54:06.654017 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.060287 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb"] Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.161776 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727701ee-3475-4bf5-af07-2b6ae8372e29" path="/var/lib/kubelet/pods/727701ee-3475-4bf5-af07-2b6ae8372e29/volumes" Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.163515 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f21a0a-06fd-4f66-bef5-c17554e9aae7" path="/var/lib/kubelet/pods/a6f21a0a-06fd-4f66-bef5-c17554e9aae7/volumes" Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.909453 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" event={"ID":"da2b33ac-265e-4086-b87e-9ce6457997a9","Type":"ContainerStarted","Data":"fc3b02182033c25c6e76bf709aacf02c7cf9741afd534ab7124f0e09f70bdf8a"} Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.909507 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" event={"ID":"da2b33ac-265e-4086-b87e-9ce6457997a9","Type":"ContainerStarted","Data":"deeaefbb624816c181da2c9646886fd4b59ec6136f56167ca7b0ab70926e1b75"} Mar 09 09:54:07 crc kubenswrapper[4971]: I0309 09:54:07.951366 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" podStartSLOduration=1.9513247790000001 podStartE2EDuration="1.951324779s" podCreationTimestamp="2026-03-09 09:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:07.946182154 +0000 UTC m=+2051.506109964" 
watchObservedRunningTime="2026-03-09 09:54:07.951324779 +0000 UTC m=+2051.511252589" Mar 09 09:54:08 crc kubenswrapper[4971]: I0309 09:54:08.916240 4971 generic.go:334] "Generic (PLEG): container finished" podID="da2b33ac-265e-4086-b87e-9ce6457997a9" containerID="fc3b02182033c25c6e76bf709aacf02c7cf9741afd534ab7124f0e09f70bdf8a" exitCode=0 Mar 09 09:54:08 crc kubenswrapper[4971]: I0309 09:54:08.916279 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" event={"ID":"da2b33ac-265e-4086-b87e-9ce6457997a9","Type":"ContainerDied","Data":"fc3b02182033c25c6e76bf709aacf02c7cf9741afd534ab7124f0e09f70bdf8a"} Mar 09 09:54:09 crc kubenswrapper[4971]: I0309 09:54:09.152878 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:54:09 crc kubenswrapper[4971]: E0309 09:54:09.153501 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.198731 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.230063 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb"] Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.240927 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb"] Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.355284 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.355569 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.355773 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.355912 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.356091 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.356211 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhm6w\" (UniqueName: \"kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w\") pod \"da2b33ac-265e-4086-b87e-9ce6457997a9\" (UID: \"da2b33ac-265e-4086-b87e-9ce6457997a9\") " Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.356206 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.356830 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.357024 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.357143 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/da2b33ac-265e-4086-b87e-9ce6457997a9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.366569 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w" (OuterVolumeSpecName: "kube-api-access-dhm6w") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "kube-api-access-dhm6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.385281 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.386294 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts" (OuterVolumeSpecName: "scripts") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.388741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "da2b33ac-265e-4086-b87e-9ce6457997a9" (UID: "da2b33ac-265e-4086-b87e-9ce6457997a9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.458505 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.458561 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhm6w\" (UniqueName: \"kubernetes.io/projected/da2b33ac-265e-4086-b87e-9ce6457997a9-kube-api-access-dhm6w\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.458577 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/da2b33ac-265e-4086-b87e-9ce6457997a9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.458589 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da2b33ac-265e-4086-b87e-9ce6457997a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.935302 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deeaefbb624816c181da2c9646886fd4b59ec6136f56167ca7b0ab70926e1b75" Mar 09 09:54:10 crc kubenswrapper[4971]: I0309 09:54:10.935439 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7ljb" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.161840 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2b33ac-265e-4086-b87e-9ce6457997a9" path="/var/lib/kubelet/pods/da2b33ac-265e-4086-b87e-9ce6457997a9/volumes" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.376342 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtf76"] Mar 09 09:54:11 crc kubenswrapper[4971]: E0309 09:54:11.376865 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2b33ac-265e-4086-b87e-9ce6457997a9" containerName="swift-ring-rebalance" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.376881 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2b33ac-265e-4086-b87e-9ce6457997a9" containerName="swift-ring-rebalance" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.377077 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2b33ac-265e-4086-b87e-9ce6457997a9" containerName="swift-ring-rebalance" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.377729 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.385528 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtf76"] Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.429521 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.429761 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474486 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfkb\" (UniqueName: \"kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474542 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474561 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474585 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474653 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.474672 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.576737 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.576798 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 
09:54:11.576878 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfkb\" (UniqueName: \"kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.576906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.576928 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.576955 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.577396 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc 
kubenswrapper[4971]: I0309 09:54:11.577953 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.578170 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.580418 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.581254 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: I0309 09:54:11.594106 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfkb\" (UniqueName: \"kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb\") pod \"swift-ring-rebalance-debug-rtf76\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:11 crc kubenswrapper[4971]: 
I0309 09:54:11.761427 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:12 crc kubenswrapper[4971]: I0309 09:54:12.168463 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtf76"] Mar 09 09:54:12 crc kubenswrapper[4971]: I0309 09:54:12.955806 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" event={"ID":"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411","Type":"ContainerStarted","Data":"cdb8b706eb4a3e8df58f981c50ae4504f7725109f031d3d4724ec6d7c8530b43"} Mar 09 09:54:12 crc kubenswrapper[4971]: I0309 09:54:12.956128 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" event={"ID":"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411","Type":"ContainerStarted","Data":"dc0e02609b590aa63803730e9157d30226464d9437e8b01ac6fc470d9b699d88"} Mar 09 09:54:12 crc kubenswrapper[4971]: I0309 09:54:12.975452 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" podStartSLOduration=1.975431191 podStartE2EDuration="1.975431191s" podCreationTimestamp="2026-03-09 09:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:12.973537308 +0000 UTC m=+2056.533465128" watchObservedRunningTime="2026-03-09 09:54:12.975431191 +0000 UTC m=+2056.535358991" Mar 09 09:54:13 crc kubenswrapper[4971]: I0309 09:54:13.963970 4971 generic.go:334] "Generic (PLEG): container finished" podID="cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" containerID="cdb8b706eb4a3e8df58f981c50ae4504f7725109f031d3d4724ec6d7c8530b43" exitCode=0 Mar 09 09:54:13 crc kubenswrapper[4971]: I0309 09:54:13.964014 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" 
event={"ID":"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411","Type":"ContainerDied","Data":"cdb8b706eb4a3e8df58f981c50ae4504f7725109f031d3d4724ec6d7c8530b43"} Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.235028 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.267470 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtf76"] Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.272793 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtf76"] Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.330938 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.331017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.331078 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.331105 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zfkb\" (UniqueName: 
\"kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.331974 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.331973 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.332002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts\") pod \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\" (UID: \"cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411\") " Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.332014 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.332513 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.332537 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.338593 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb" (OuterVolumeSpecName: "kube-api-access-7zfkb") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "kube-api-access-7zfkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.353139 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.356374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts" (OuterVolumeSpecName: "scripts") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.357880 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" (UID: "cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.433472 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.433504 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.433517 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.433528 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zfkb\" (UniqueName: \"kubernetes.io/projected/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411-kube-api-access-7zfkb\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.981211 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0e02609b590aa63803730e9157d30226464d9437e8b01ac6fc470d9b699d88" Mar 09 09:54:15 crc kubenswrapper[4971]: I0309 09:54:15.981507 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtf76" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.412399 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l8jds"] Mar 09 09:54:16 crc kubenswrapper[4971]: E0309 09:54:16.413171 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" containerName="swift-ring-rebalance" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.413196 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" containerName="swift-ring-rebalance" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.413561 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" containerName="swift-ring-rebalance" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.414326 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.417083 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.417328 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.421382 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l8jds"] Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555307 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkj7\" (UniqueName: \"kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555393 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555453 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555566 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.555590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf\") pod 
\"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657481 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657550 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657582 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657644 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkj7\" (UniqueName: \"kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657674 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.657742 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.658319 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.658489 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.658827 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.660893 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.660956 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.679532 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkj7\" (UniqueName: \"kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7\") pod \"swift-ring-rebalance-debug-l8jds\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:16 crc kubenswrapper[4971]: I0309 09:54:16.753950 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:17 crc kubenswrapper[4971]: I0309 09:54:17.162154 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411" path="/var/lib/kubelet/pods/cb714d38-5c6f-4d5f-9ea4-cd2d51f4b411/volumes" Mar 09 09:54:17 crc kubenswrapper[4971]: I0309 09:54:17.192781 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l8jds"] Mar 09 09:54:17 crc kubenswrapper[4971]: W0309 09:54:17.204063 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3bbf0a9_19ee_46ff_8046_ca0095974f7e.slice/crio-782f70c1d65037d68c70b776f0950a6f97d84c09e6cea8556beb2468ca693c02 WatchSource:0}: Error finding container 782f70c1d65037d68c70b776f0950a6f97d84c09e6cea8556beb2468ca693c02: Status 404 returned error can't find the container with id 782f70c1d65037d68c70b776f0950a6f97d84c09e6cea8556beb2468ca693c02 Mar 09 09:54:17 crc kubenswrapper[4971]: I0309 09:54:17.998666 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" event={"ID":"b3bbf0a9-19ee-46ff-8046-ca0095974f7e","Type":"ContainerStarted","Data":"f810a40392bc23040cc1f561eef6f9f4ddcd593ed4be1db42731881536b6c9e6"} Mar 09 09:54:17 crc kubenswrapper[4971]: I0309 09:54:17.999037 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" event={"ID":"b3bbf0a9-19ee-46ff-8046-ca0095974f7e","Type":"ContainerStarted","Data":"782f70c1d65037d68c70b776f0950a6f97d84c09e6cea8556beb2468ca693c02"} Mar 09 09:54:18 crc kubenswrapper[4971]: I0309 09:54:18.027109 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" podStartSLOduration=2.02707839 podStartE2EDuration="2.02707839s" podCreationTimestamp="2026-03-09 
09:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:18.023491289 +0000 UTC m=+2061.583419089" watchObservedRunningTime="2026-03-09 09:54:18.02707839 +0000 UTC m=+2061.587006210" Mar 09 09:54:19 crc kubenswrapper[4971]: I0309 09:54:19.008304 4971 generic.go:334] "Generic (PLEG): container finished" podID="b3bbf0a9-19ee-46ff-8046-ca0095974f7e" containerID="f810a40392bc23040cc1f561eef6f9f4ddcd593ed4be1db42731881536b6c9e6" exitCode=0 Mar 09 09:54:19 crc kubenswrapper[4971]: I0309 09:54:19.008408 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" event={"ID":"b3bbf0a9-19ee-46ff-8046-ca0095974f7e","Type":"ContainerDied","Data":"f810a40392bc23040cc1f561eef6f9f4ddcd593ed4be1db42731881536b6c9e6"} Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.294623 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.329196 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l8jds"] Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.336292 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l8jds"] Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411679 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqkj7\" (UniqueName: 
\"kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411791 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411833 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411859 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.411901 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf\") pod \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\" (UID: \"b3bbf0a9-19ee-46ff-8046-ca0095974f7e\") " Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.412636 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.412776 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.420621 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7" (OuterVolumeSpecName: "kube-api-access-bqkj7") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "kube-api-access-bqkj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.435550 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts" (OuterVolumeSpecName: "scripts") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.438633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.439206 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b3bbf0a9-19ee-46ff-8046-ca0095974f7e" (UID: "b3bbf0a9-19ee-46ff-8046-ca0095974f7e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513414 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513462 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513476 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513487 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513497 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:20 crc kubenswrapper[4971]: I0309 09:54:20.513507 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqkj7\" (UniqueName: 
\"kubernetes.io/projected/b3bbf0a9-19ee-46ff-8046-ca0095974f7e-kube-api-access-bqkj7\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.026872 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="782f70c1d65037d68c70b776f0950a6f97d84c09e6cea8556beb2468ca693c02" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.026940 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l8jds" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.160708 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bbf0a9-19ee-46ff-8046-ca0095974f7e" path="/var/lib/kubelet/pods/b3bbf0a9-19ee-46ff-8046-ca0095974f7e/volumes" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.458746 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"] Mar 09 09:54:21 crc kubenswrapper[4971]: E0309 09:54:21.459050 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bbf0a9-19ee-46ff-8046-ca0095974f7e" containerName="swift-ring-rebalance" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.459066 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bbf0a9-19ee-46ff-8046-ca0095974f7e" containerName="swift-ring-rebalance" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.459256 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bbf0a9-19ee-46ff-8046-ca0095974f7e" containerName="swift-ring-rebalance" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.459765 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.461801 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.467763 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"] Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.470483 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.527736 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sh52\" (UniqueName: \"kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.527795 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.527875 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.527932 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.527996 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.528015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.629684 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.629763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc 
kubenswrapper[4971]: I0309 09:54:21.629815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.629831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.629867 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sh52\" (UniqueName: \"kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.629887 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.630528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" Mar 09 09:54:21 crc 
kubenswrapper[4971]: I0309 09:54:21.630828 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.630851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.634497 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.637181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.658471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sh52\" (UniqueName: \"kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52\") pod \"swift-ring-rebalance-debug-wchdm\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:21 crc kubenswrapper[4971]: I0309 09:54:21.776026 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:22 crc kubenswrapper[4971]: I0309 09:54:22.152851 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808"
Mar 09 09:54:22 crc kubenswrapper[4971]: E0309 09:54:22.153403 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 09:54:22 crc kubenswrapper[4971]: I0309 09:54:22.193757 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"]
Mar 09 09:54:22 crc kubenswrapper[4971]: W0309 09:54:22.195802 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973facd7_3d2e_41c8_aa0f_c0e92eb3260d.slice/crio-9c022e6e13f6a03759adb4eb98908fa64ade6fcc15fa4d759a7090cc65984f01 WatchSource:0}: Error finding container 9c022e6e13f6a03759adb4eb98908fa64ade6fcc15fa4d759a7090cc65984f01: Status 404 returned error can't find the container with id 9c022e6e13f6a03759adb4eb98908fa64ade6fcc15fa4d759a7090cc65984f01
Mar 09 09:54:23 crc kubenswrapper[4971]: I0309 09:54:23.063834 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" event={"ID":"973facd7-3d2e-41c8-aa0f-c0e92eb3260d","Type":"ContainerStarted","Data":"401ab593de2c1adf552ea91a59a10c4422a4e147a2aad4b7488448868ee23ae0"}
Mar 09 09:54:23 crc kubenswrapper[4971]: I0309 09:54:23.065334 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" event={"ID":"973facd7-3d2e-41c8-aa0f-c0e92eb3260d","Type":"ContainerStarted","Data":"9c022e6e13f6a03759adb4eb98908fa64ade6fcc15fa4d759a7090cc65984f01"}
Mar 09 09:54:23 crc kubenswrapper[4971]: I0309 09:54:23.080601 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" podStartSLOduration=2.08056024 podStartE2EDuration="2.08056024s" podCreationTimestamp="2026-03-09 09:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:23.07950935 +0000 UTC m=+2066.639437190" watchObservedRunningTime="2026-03-09 09:54:23.08056024 +0000 UTC m=+2066.640488050"
Mar 09 09:54:24 crc kubenswrapper[4971]: I0309 09:54:24.089880 4971 generic.go:334] "Generic (PLEG): container finished" podID="973facd7-3d2e-41c8-aa0f-c0e92eb3260d" containerID="401ab593de2c1adf552ea91a59a10c4422a4e147a2aad4b7488448868ee23ae0" exitCode=0
Mar 09 09:54:24 crc kubenswrapper[4971]: I0309 09:54:24.089961 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm" event={"ID":"973facd7-3d2e-41c8-aa0f-c0e92eb3260d","Type":"ContainerDied","Data":"401ab593de2c1adf552ea91a59a10c4422a4e147a2aad4b7488448868ee23ae0"}
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.372719 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.409835 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"]
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.416982 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"]
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494438 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sh52\" (UniqueName: \"kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494555 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494606 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494649 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.494739 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift\") pod \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\" (UID: \"973facd7-3d2e-41c8-aa0f-c0e92eb3260d\") "
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.495666 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.495814 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.500453 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52" (OuterVolumeSpecName: "kube-api-access-7sh52") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "kube-api-access-7sh52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.516610 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts" (OuterVolumeSpecName: "scripts") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.519472 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.524797 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "973facd7-3d2e-41c8-aa0f-c0e92eb3260d" (UID: "973facd7-3d2e-41c8-aa0f-c0e92eb3260d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596222 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596466 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596576 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sh52\" (UniqueName: \"kubernetes.io/projected/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-kube-api-access-7sh52\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596698 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596792 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:25 crc kubenswrapper[4971]: I0309 09:54:25.596869 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/973facd7-3d2e-41c8-aa0f-c0e92eb3260d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.108395 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c022e6e13f6a03759adb4eb98908fa64ade6fcc15fa4d759a7090cc65984f01"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.108463 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wchdm"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.546827 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"]
Mar 09 09:54:26 crc kubenswrapper[4971]: E0309 09:54:26.547124 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973facd7-3d2e-41c8-aa0f-c0e92eb3260d" containerName="swift-ring-rebalance"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.547135 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="973facd7-3d2e-41c8-aa0f-c0e92eb3260d" containerName="swift-ring-rebalance"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.547286 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="973facd7-3d2e-41c8-aa0f-c0e92eb3260d" containerName="swift-ring-rebalance"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.547868 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.561487 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.561622 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.571712 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"]
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614522 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614608 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614658 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614690 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85n49\" (UniqueName: \"kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.614828 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716039 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716123 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716159 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716177 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716204 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85n49\" (UniqueName: \"kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.716237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.717020 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.717033 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.717331 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.721027 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.721434 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.735285 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85n49\" (UniqueName: \"kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49\") pod \"swift-ring-rebalance-debug-d9pd2\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:26 crc kubenswrapper[4971]: I0309 09:54:26.880065 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:27 crc kubenswrapper[4971]: I0309 09:54:27.161444 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973facd7-3d2e-41c8-aa0f-c0e92eb3260d" path="/var/lib/kubelet/pods/973facd7-3d2e-41c8-aa0f-c0e92eb3260d/volumes"
Mar 09 09:54:27 crc kubenswrapper[4971]: I0309 09:54:27.370784 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"]
Mar 09 09:54:28 crc kubenswrapper[4971]: I0309 09:54:28.145172 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2" event={"ID":"9f40f000-324d-4d5d-8318-5d86aa1fc7f7","Type":"ContainerStarted","Data":"a9cdeef9247d9026bae7da844d5381632899881955d9b4084c0f6cbda751b840"}
Mar 09 09:54:28 crc kubenswrapper[4971]: I0309 09:54:28.145617 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2" event={"ID":"9f40f000-324d-4d5d-8318-5d86aa1fc7f7","Type":"ContainerStarted","Data":"fa3868013e98affcbd00987bff306c9a2b92ebbfdf0b904128bba9e1b58b33c6"}
Mar 09 09:54:28 crc kubenswrapper[4971]: I0309 09:54:28.174365 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2" podStartSLOduration=2.174318055 podStartE2EDuration="2.174318055s" podCreationTimestamp="2026-03-09 09:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:28.167332308 +0000 UTC m=+2071.727260118" watchObservedRunningTime="2026-03-09 09:54:28.174318055 +0000 UTC m=+2071.734245865"
Mar 09 09:54:29 crc kubenswrapper[4971]: I0309 09:54:29.154481 4971 generic.go:334] "Generic (PLEG): container finished" podID="9f40f000-324d-4d5d-8318-5d86aa1fc7f7" containerID="a9cdeef9247d9026bae7da844d5381632899881955d9b4084c0f6cbda751b840" exitCode=0
Mar 09 09:54:29 crc kubenswrapper[4971]: I0309 09:54:29.160816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2" event={"ID":"9f40f000-324d-4d5d-8318-5d86aa1fc7f7","Type":"ContainerDied","Data":"a9cdeef9247d9026bae7da844d5381632899881955d9b4084c0f6cbda751b840"}
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.435321 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.469663 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"]
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.474409 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"]
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575336 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575429 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575587 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575613 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85n49\" (UniqueName: \"kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift\") pod \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\" (UID: \"9f40f000-324d-4d5d-8318-5d86aa1fc7f7\") "
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.575928 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.576399 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.592799 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49" (OuterVolumeSpecName: "kube-api-access-85n49") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "kube-api-access-85n49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.597857 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts" (OuterVolumeSpecName: "scripts") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.600837 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.602114 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9f40f000-324d-4d5d-8318-5d86aa1fc7f7" (UID: "9f40f000-324d-4d5d-8318-5d86aa1fc7f7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677322 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677381 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677395 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677404 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677416 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85n49\" (UniqueName: \"kubernetes.io/projected/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-kube-api-access-85n49\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:30 crc kubenswrapper[4971]: I0309 09:54:30.677425 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f40f000-324d-4d5d-8318-5d86aa1fc7f7-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.160374 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f40f000-324d-4d5d-8318-5d86aa1fc7f7" path="/var/lib/kubelet/pods/9f40f000-324d-4d5d-8318-5d86aa1fc7f7/volumes"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.169830 4971 scope.go:117] "RemoveContainer" containerID="a9cdeef9247d9026bae7da844d5381632899881955d9b4084c0f6cbda751b840"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.169923 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d9pd2"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.603480 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"]
Mar 09 09:54:31 crc kubenswrapper[4971]: E0309 09:54:31.604209 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f40f000-324d-4d5d-8318-5d86aa1fc7f7" containerName="swift-ring-rebalance"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.604222 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f40f000-324d-4d5d-8318-5d86aa1fc7f7" containerName="swift-ring-rebalance"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.604418 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f40f000-324d-4d5d-8318-5d86aa1fc7f7" containerName="swift-ring-rebalance"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.604962 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.607425 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.607615 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.611617 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"]
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.692905 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.692973 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.693036 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.693064 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.693215 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrpm\" (UniqueName: \"kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.693341 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795660 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795743 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795794 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795820 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrpm\" (UniqueName: \"kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.795931 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.796373 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"
Mar 09 09:54:31 crc
kubenswrapper[4971]: I0309 09:54:31.796692 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.797076 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.802624 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.803644 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:31 crc kubenswrapper[4971]: I0309 09:54:31.815190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrpm\" (UniqueName: \"kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm\") pod \"swift-ring-rebalance-debug-7vp6w\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:31 crc kubenswrapper[4971]: 
I0309 09:54:31.935335 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:32 crc kubenswrapper[4971]: I0309 09:54:32.357854 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"] Mar 09 09:54:32 crc kubenswrapper[4971]: W0309 09:54:32.363246 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2e57025_5e3e_43cc_814e_e97479d54db4.slice/crio-c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2 WatchSource:0}: Error finding container c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2: Status 404 returned error can't find the container with id c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2 Mar 09 09:54:33 crc kubenswrapper[4971]: I0309 09:54:33.192362 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" event={"ID":"c2e57025-5e3e-43cc-814e-e97479d54db4","Type":"ContainerStarted","Data":"88bc415847a0320875c8b7783b8af684e29dcef469f488e1ffb54622628cf7c6"} Mar 09 09:54:33 crc kubenswrapper[4971]: I0309 09:54:33.192654 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" event={"ID":"c2e57025-5e3e-43cc-814e-e97479d54db4","Type":"ContainerStarted","Data":"c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2"} Mar 09 09:54:34 crc kubenswrapper[4971]: I0309 09:54:34.202813 4971 generic.go:334] "Generic (PLEG): container finished" podID="c2e57025-5e3e-43cc-814e-e97479d54db4" containerID="88bc415847a0320875c8b7783b8af684e29dcef469f488e1ffb54622628cf7c6" exitCode=0 Mar 09 09:54:34 crc kubenswrapper[4971]: I0309 09:54:34.202864 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" 
event={"ID":"c2e57025-5e3e-43cc-814e-e97479d54db4","Type":"ContainerDied","Data":"88bc415847a0320875c8b7783b8af684e29dcef469f488e1ffb54622628cf7c6"} Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.493263 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549199 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549253 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549370 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrpm\" (UniqueName: \"kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549461 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.549514 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf\") pod \"c2e57025-5e3e-43cc-814e-e97479d54db4\" (UID: \"c2e57025-5e3e-43cc-814e-e97479d54db4\") " Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.550327 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.550741 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.556233 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm" (OuterVolumeSpecName: "kube-api-access-mnrpm") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "kube-api-access-mnrpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.569605 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts" (OuterVolumeSpecName: "scripts") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.575648 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.576176 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c2e57025-5e3e-43cc-814e-e97479d54db4" (UID: "c2e57025-5e3e-43cc-814e-e97479d54db4"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651319 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651695 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651713 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c2e57025-5e3e-43cc-814e-e97479d54db4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651726 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c2e57025-5e3e-43cc-814e-e97479d54db4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651737 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c2e57025-5e3e-43cc-814e-e97479d54db4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:35 crc kubenswrapper[4971]: I0309 09:54:35.651750 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrpm\" (UniqueName: \"kubernetes.io/projected/c2e57025-5e3e-43cc-814e-e97479d54db4-kube-api-access-mnrpm\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.152425 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:54:36 crc kubenswrapper[4971]: E0309 09:54:36.152683 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.221273 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" event={"ID":"c2e57025-5e3e-43cc-814e-e97479d54db4","Type":"ContainerDied","Data":"c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2"} Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.221304 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w" Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.221317 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c81ad96a1a671211c1b3a4a17687da82c90944befd52ff898ec5f8136db967d2" Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.338107 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"] Mar 09 09:54:36 crc kubenswrapper[4971]: I0309 09:54:36.344251 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7vp6w"] Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.162636 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e57025-5e3e-43cc-814e-e97479d54db4" path="/var/lib/kubelet/pods/c2e57025-5e3e-43cc-814e-e97479d54db4/volumes" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.468851 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl"] Mar 09 09:54:37 crc kubenswrapper[4971]: E0309 09:54:37.469400 4971 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2e57025-5e3e-43cc-814e-e97479d54db4" containerName="swift-ring-rebalance" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.469424 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e57025-5e3e-43cc-814e-e97479d54db4" containerName="swift-ring-rebalance" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.469655 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e57025-5e3e-43cc-814e-e97479d54db4" containerName="swift-ring-rebalance" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.470417 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.472365 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.472599 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.476146 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl"] Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: 
\"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6j9m\" (UniqueName: \"kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585663 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585688 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.585705 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686462 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686510 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686527 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686567 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686586 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.686637 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6j9m\" 
(UniqueName: \"kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.687490 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.687706 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.687761 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.690098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.693735 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.702898 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6j9m\" (UniqueName: \"kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m\") pod \"swift-ring-rebalance-debug-t4sfl\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:37 crc kubenswrapper[4971]: I0309 09:54:37.786968 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:38 crc kubenswrapper[4971]: I0309 09:54:38.235682 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl"] Mar 09 09:54:38 crc kubenswrapper[4971]: W0309 09:54:38.238526 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e25d95_3f20_4639_8ae4_6ce3fb85dd1f.slice/crio-d5747b8fc7be23320d074a468bedd32bcacb1b92052922a019974d43c427a415 WatchSource:0}: Error finding container d5747b8fc7be23320d074a468bedd32bcacb1b92052922a019974d43c427a415: Status 404 returned error can't find the container with id d5747b8fc7be23320d074a468bedd32bcacb1b92052922a019974d43c427a415 Mar 09 09:54:39 crc kubenswrapper[4971]: I0309 09:54:39.244500 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" event={"ID":"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f","Type":"ContainerStarted","Data":"c10f2007aa9d1529898e8b8bebdfdbed76f2c34c81e3a788d20064acb41b463f"} Mar 09 09:54:39 crc kubenswrapper[4971]: I0309 09:54:39.244874 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" event={"ID":"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f","Type":"ContainerStarted","Data":"d5747b8fc7be23320d074a468bedd32bcacb1b92052922a019974d43c427a415"} Mar 09 09:54:39 crc kubenswrapper[4971]: I0309 09:54:39.266094 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" podStartSLOduration=2.266074244 podStartE2EDuration="2.266074244s" podCreationTimestamp="2026-03-09 09:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:39.25882673 +0000 UTC m=+2082.818754540" watchObservedRunningTime="2026-03-09 09:54:39.266074244 +0000 UTC m=+2082.826002064" Mar 09 09:54:40 crc kubenswrapper[4971]: I0309 09:54:40.253336 4971 generic.go:334] "Generic (PLEG): container finished" podID="31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" containerID="c10f2007aa9d1529898e8b8bebdfdbed76f2c34c81e3a788d20064acb41b463f" exitCode=0 Mar 09 09:54:40 crc kubenswrapper[4971]: I0309 09:54:40.254130 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" event={"ID":"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f","Type":"ContainerDied","Data":"c10f2007aa9d1529898e8b8bebdfdbed76f2c34c81e3a788d20064acb41b463f"} Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.545672 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.587003 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl"] Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.592327 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl"] Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.645242 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.645594 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.645700 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6j9m\" (UniqueName: \"kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.645832 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.645933 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.646076 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices\") pod \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\" (UID: \"31e25d95-3f20-4639-8ae4-6ce3fb85dd1f\") " Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.646586 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.646824 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.647079 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.647144 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.655659 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m" (OuterVolumeSpecName: "kube-api-access-z6j9m") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "kube-api-access-z6j9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.669688 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts" (OuterVolumeSpecName: "scripts") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.670981 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.676511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" (UID: "31e25d95-3f20-4639-8ae4-6ce3fb85dd1f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.748645 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.748690 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.748706 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:41 crc kubenswrapper[4971]: I0309 09:54:41.748724 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6j9m\" (UniqueName: \"kubernetes.io/projected/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f-kube-api-access-z6j9m\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.274045 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5747b8fc7be23320d074a468bedd32bcacb1b92052922a019974d43c427a415" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.274133 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t4sfl" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.728458 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw"] Mar 09 09:54:42 crc kubenswrapper[4971]: E0309 09:54:42.729203 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" containerName="swift-ring-rebalance" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.729298 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" containerName="swift-ring-rebalance" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.729712 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" containerName="swift-ring-rebalance" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.730424 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.735079 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.739905 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw"] Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.741971 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.863782 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.863855 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tzhb\" (UniqueName: \"kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.863991 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.864063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.864111 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.864198 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966142 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tzhb\" (UniqueName: \"kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966291 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.966470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.967120 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.967981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.968056 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.971131 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.987169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:42 crc kubenswrapper[4971]: I0309 09:54:42.988740 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tzhb\" (UniqueName: \"kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb\") pod \"swift-ring-rebalance-debug-mtpzw\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:43 crc kubenswrapper[4971]: I0309 09:54:43.051070 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:43 crc kubenswrapper[4971]: I0309 09:54:43.167396 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e25d95-3f20-4639-8ae4-6ce3fb85dd1f" path="/var/lib/kubelet/pods/31e25d95-3f20-4639-8ae4-6ce3fb85dd1f/volumes" Mar 09 09:54:43 crc kubenswrapper[4971]: I0309 09:54:43.492631 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw"] Mar 09 09:54:44 crc kubenswrapper[4971]: I0309 09:54:44.291457 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" event={"ID":"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66","Type":"ContainerStarted","Data":"306afb51bef24f497157bb49958a90b16e7a8c61bbb7e379725274c10fc16c83"} Mar 09 09:54:44 crc kubenswrapper[4971]: I0309 09:54:44.291782 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" event={"ID":"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66","Type":"ContainerStarted","Data":"082cd0a719e7993e96a17e661860094d54d8a4049e67f5d87f70081574901e03"} Mar 09 09:54:45 crc kubenswrapper[4971]: I0309 09:54:45.304226 4971 generic.go:334] "Generic (PLEG): container finished" podID="79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" containerID="306afb51bef24f497157bb49958a90b16e7a8c61bbb7e379725274c10fc16c83" exitCode=0 Mar 09 09:54:45 crc kubenswrapper[4971]: I0309 09:54:45.304291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" event={"ID":"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66","Type":"ContainerDied","Data":"306afb51bef24f497157bb49958a90b16e7a8c61bbb7e379725274c10fc16c83"} Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.558470 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.585817 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw"] Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.592506 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw"] Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.658920 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.658968 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659045 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tzhb\" (UniqueName: \"kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659122 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts\") pod \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\" (UID: \"79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66\") " Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659495 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.659811 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.665295 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb" (OuterVolumeSpecName: "kube-api-access-8tzhb") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "kube-api-access-8tzhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.681568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.682861 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts" (OuterVolumeSpecName: "scripts") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.686581 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" (UID: "79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761093 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761146 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tzhb\" (UniqueName: \"kubernetes.io/projected/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-kube-api-access-8tzhb\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761163 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761174 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761186 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:46 crc kubenswrapper[4971]: I0309 09:54:46.761198 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.161359 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" path="/var/lib/kubelet/pods/79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66/volumes" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.320280 4971 scope.go:117] "RemoveContainer" 
containerID="306afb51bef24f497157bb49958a90b16e7a8c61bbb7e379725274c10fc16c83" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.320385 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mtpzw" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.733716 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md4jt"] Mar 09 09:54:47 crc kubenswrapper[4971]: E0309 09:54:47.734365 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" containerName="swift-ring-rebalance" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.734386 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" containerName="swift-ring-rebalance" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.734603 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="79377a2a-07c9-4c9b-bbd6-9cb24bfc7c66" containerName="swift-ring-rebalance" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.735173 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.737573 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.738925 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.740029 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md4jt"] Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877368 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877436 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674tm\" (UniqueName: \"kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877484 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877532 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877555 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.877617 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.978467 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.978530 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 
09:54:47.978565 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.978662 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.978698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674tm\" (UniqueName: \"kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.978759 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.979234 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: 
I0309 09:54:47.979441 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.979442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.984492 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:47 crc kubenswrapper[4971]: I0309 09:54:47.987846 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:48 crc kubenswrapper[4971]: I0309 09:54:48.003257 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674tm\" (UniqueName: \"kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm\") pod \"swift-ring-rebalance-debug-md4jt\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:48 crc kubenswrapper[4971]: I0309 09:54:48.049021 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:48 crc kubenswrapper[4971]: I0309 09:54:48.529835 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md4jt"] Mar 09 09:54:49 crc kubenswrapper[4971]: I0309 09:54:49.342777 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" event={"ID":"81d9be96-f9f3-46a5-b857-071f4ed3ff57","Type":"ContainerStarted","Data":"e62bbf56a2d4847a368cdb1694811859dbf258a333ec4cf664bd68ab2be6cc9b"} Mar 09 09:54:49 crc kubenswrapper[4971]: I0309 09:54:49.343166 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" event={"ID":"81d9be96-f9f3-46a5-b857-071f4ed3ff57","Type":"ContainerStarted","Data":"adbe2669ac401a462ce50f15359d76d80462f7fb57d107449e89962d227c263b"} Mar 09 09:54:49 crc kubenswrapper[4971]: I0309 09:54:49.363782 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" podStartSLOduration=2.363763752 podStartE2EDuration="2.363763752s" podCreationTimestamp="2026-03-09 09:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:49.358940526 +0000 UTC m=+2092.918868356" watchObservedRunningTime="2026-03-09 09:54:49.363763752 +0000 UTC m=+2092.923691562" Mar 09 09:54:50 crc kubenswrapper[4971]: I0309 09:54:50.152550 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:54:50 crc kubenswrapper[4971]: I0309 09:54:50.355026 4971 generic.go:334] "Generic (PLEG): container finished" podID="81d9be96-f9f3-46a5-b857-071f4ed3ff57" containerID="e62bbf56a2d4847a368cdb1694811859dbf258a333ec4cf664bd68ab2be6cc9b" exitCode=0 Mar 09 09:54:50 crc 
kubenswrapper[4971]: I0309 09:54:50.355075 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" event={"ID":"81d9be96-f9f3-46a5-b857-071f4ed3ff57","Type":"ContainerDied","Data":"e62bbf56a2d4847a368cdb1694811859dbf258a333ec4cf664bd68ab2be6cc9b"} Mar 09 09:54:50 crc kubenswrapper[4971]: I0309 09:54:50.358995 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9"} Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.643022 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.674072 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md4jt"] Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.686481 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md4jt"] Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736645 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736707 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736740 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736781 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674tm\" (UniqueName: \"kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736842 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.736880 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices\") pod \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\" (UID: \"81d9be96-f9f3-46a5-b857-071f4ed3ff57\") " Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.737567 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.738284 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.741599 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm" (OuterVolumeSpecName: "kube-api-access-674tm") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "kube-api-access-674tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.758108 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.758171 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.766652 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts" (OuterVolumeSpecName: "scripts") pod "81d9be96-f9f3-46a5-b857-071f4ed3ff57" (UID: "81d9be96-f9f3-46a5-b857-071f4ed3ff57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838294 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838335 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838411 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81d9be96-f9f3-46a5-b857-071f4ed3ff57-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838425 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81d9be96-f9f3-46a5-b857-071f4ed3ff57-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838439 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81d9be96-f9f3-46a5-b857-071f4ed3ff57-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:51 crc kubenswrapper[4971]: I0309 09:54:51.838451 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674tm\" (UniqueName: 
\"kubernetes.io/projected/81d9be96-f9f3-46a5-b857-071f4ed3ff57-kube-api-access-674tm\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.377210 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adbe2669ac401a462ce50f15359d76d80462f7fb57d107449e89962d227c263b" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.377245 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md4jt" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.820503 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q"] Mar 09 09:54:52 crc kubenswrapper[4971]: E0309 09:54:52.820812 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d9be96-f9f3-46a5-b857-071f4ed3ff57" containerName="swift-ring-rebalance" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.820823 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d9be96-f9f3-46a5-b857-071f4ed3ff57" containerName="swift-ring-rebalance" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.820955 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d9be96-f9f3-46a5-b857-071f4ed3ff57" containerName="swift-ring-rebalance" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.821504 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.823848 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.828756 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.832055 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q"] Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.955459 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.955835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.955866 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.955925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.955994 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdkv\" (UniqueName: \"kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:52 crc kubenswrapper[4971]: I0309 09:54:52.956020 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057411 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057466 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057519 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057603 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdkv\" (UniqueName: \"kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.057626 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.058221 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.058614 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.058635 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.061549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.062587 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.075203 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdkv\" (UniqueName: \"kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv\") pod \"swift-ring-rebalance-debug-zlb2q\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.161957 4971 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d9be96-f9f3-46a5-b857-071f4ed3ff57" path="/var/lib/kubelet/pods/81d9be96-f9f3-46a5-b857-071f4ed3ff57/volumes" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.182193 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:53 crc kubenswrapper[4971]: I0309 09:54:53.669766 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q"] Mar 09 09:54:54 crc kubenswrapper[4971]: I0309 09:54:54.406913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" event={"ID":"3a012e5c-0a72-421f-8833-e5ab3e9466a9","Type":"ContainerStarted","Data":"52f66a49d3b65a2941efb947ade670e45ab03e36acb95ba082a24407ca851c63"} Mar 09 09:54:54 crc kubenswrapper[4971]: I0309 09:54:54.407235 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" event={"ID":"3a012e5c-0a72-421f-8833-e5ab3e9466a9","Type":"ContainerStarted","Data":"86325ea0eaf766f5af9f9409b941c3b79c413da4a2098be7255ec7f71fc23e40"} Mar 09 09:54:54 crc kubenswrapper[4971]: I0309 09:54:54.425657 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" podStartSLOduration=2.425634289 podStartE2EDuration="2.425634289s" podCreationTimestamp="2026-03-09 09:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:54.422934513 +0000 UTC m=+2097.982862323" watchObservedRunningTime="2026-03-09 09:54:54.425634289 +0000 UTC m=+2097.985562109" Mar 09 09:54:55 crc kubenswrapper[4971]: I0309 09:54:55.416678 4971 generic.go:334] "Generic (PLEG): container finished" podID="3a012e5c-0a72-421f-8833-e5ab3e9466a9" 
containerID="52f66a49d3b65a2941efb947ade670e45ab03e36acb95ba082a24407ca851c63" exitCode=0 Mar 09 09:54:55 crc kubenswrapper[4971]: I0309 09:54:55.416755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" event={"ID":"3a012e5c-0a72-421f-8833-e5ab3e9466a9","Type":"ContainerDied","Data":"52f66a49d3b65a2941efb947ade670e45ab03e36acb95ba082a24407ca851c63"} Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.716002 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.752384 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q"] Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.762981 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q"] Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.816421 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.816513 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhdkv\" (UniqueName: \"kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.816543 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: 
\"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.816565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.816628 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.817288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.817805 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.818110 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a012e5c-0a72-421f-8833-e5ab3e9466a9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.818608 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.824377 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv" (OuterVolumeSpecName: "kube-api-access-rhdkv") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "kube-api-access-rhdkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:56 crc kubenswrapper[4971]: E0309 09:54:56.840799 4971 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf podName:3a012e5c-0a72-421f-8833-e5ab3e9466a9 nodeName:}" failed. No retries permitted until 2026-03-09 09:54:57.340763673 +0000 UTC m=+2100.900691483 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dispersionconf" (UniqueName: "kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9") : error deleting /var/lib/kubelet/pods/3a012e5c-0a72-421f-8833-e5ab3e9466a9/volume-subpaths: remove /var/lib/kubelet/pods/3a012e5c-0a72-421f-8833-e5ab3e9466a9/volume-subpaths: no such file or directory Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.841436 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts" (OuterVolumeSpecName: "scripts") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.841754 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.919577 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.919630 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.919645 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhdkv\" (UniqueName: \"kubernetes.io/projected/3a012e5c-0a72-421f-8833-e5ab3e9466a9-kube-api-access-rhdkv\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:56 crc kubenswrapper[4971]: I0309 09:54:56.919656 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a012e5c-0a72-421f-8833-e5ab3e9466a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.426696 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") pod \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\" (UID: \"3a012e5c-0a72-421f-8833-e5ab3e9466a9\") " Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.431598 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3a012e5c-0a72-421f-8833-e5ab3e9466a9" (UID: "3a012e5c-0a72-421f-8833-e5ab3e9466a9"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.443992 4971 scope.go:117] "RemoveContainer" containerID="52f66a49d3b65a2941efb947ade670e45ab03e36acb95ba082a24407ca851c63" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.444044 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zlb2q" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.528447 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a012e5c-0a72-421f-8833-e5ab3e9466a9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.894699 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p"] Mar 09 09:54:57 crc kubenswrapper[4971]: E0309 09:54:57.895142 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a012e5c-0a72-421f-8833-e5ab3e9466a9" containerName="swift-ring-rebalance" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.895162 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a012e5c-0a72-421f-8833-e5ab3e9466a9" containerName="swift-ring-rebalance" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.895399 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a012e5c-0a72-421f-8833-e5ab3e9466a9" containerName="swift-ring-rebalance" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.896071 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.898364 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.899122 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:54:57 crc kubenswrapper[4971]: I0309 09:54:57.904521 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p"] Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034811 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034867 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vgq\" (UniqueName: \"kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034900 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034928 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.034949 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.135896 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4vgq\" (UniqueName: \"kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.135986 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 
09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.136027 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.136060 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.136110 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.136133 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.136923 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc 
kubenswrapper[4971]: I0309 09:54:58.137184 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.137204 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.140291 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.142581 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.158894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vgq\" (UniqueName: \"kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq\") pod \"swift-ring-rebalance-debug-7zd6p\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: 
I0309 09:54:58.219697 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:54:58 crc kubenswrapper[4971]: I0309 09:54:58.665194 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p"] Mar 09 09:54:58 crc kubenswrapper[4971]: W0309 09:54:58.668765 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbc0b5e5_8eb6_462b_8f88_26f0782d1aff.slice/crio-75eafcfc8b5c87a33a6ddf0e40434eee93e09679bc0ad59e824033b731054f6e WatchSource:0}: Error finding container 75eafcfc8b5c87a33a6ddf0e40434eee93e09679bc0ad59e824033b731054f6e: Status 404 returned error can't find the container with id 75eafcfc8b5c87a33a6ddf0e40434eee93e09679bc0ad59e824033b731054f6e Mar 09 09:54:59 crc kubenswrapper[4971]: I0309 09:54:59.162399 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a012e5c-0a72-421f-8833-e5ab3e9466a9" path="/var/lib/kubelet/pods/3a012e5c-0a72-421f-8833-e5ab3e9466a9/volumes" Mar 09 09:54:59 crc kubenswrapper[4971]: I0309 09:54:59.463298 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" event={"ID":"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff","Type":"ContainerStarted","Data":"de07b8af261cf7553fcab6154141ca1b2d159d0c8e55dcc8a1b5d29ff5142bca"} Mar 09 09:54:59 crc kubenswrapper[4971]: I0309 09:54:59.463600 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" event={"ID":"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff","Type":"ContainerStarted","Data":"75eafcfc8b5c87a33a6ddf0e40434eee93e09679bc0ad59e824033b731054f6e"} Mar 09 09:54:59 crc kubenswrapper[4971]: I0309 09:54:59.488133 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" 
podStartSLOduration=2.488116482 podStartE2EDuration="2.488116482s" podCreationTimestamp="2026-03-09 09:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:54:59.481722072 +0000 UTC m=+2103.041649902" watchObservedRunningTime="2026-03-09 09:54:59.488116482 +0000 UTC m=+2103.048044292" Mar 09 09:55:00 crc kubenswrapper[4971]: I0309 09:55:00.473937 4971 generic.go:334] "Generic (PLEG): container finished" podID="fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" containerID="de07b8af261cf7553fcab6154141ca1b2d159d0c8e55dcc8a1b5d29ff5142bca" exitCode=0 Mar 09 09:55:00 crc kubenswrapper[4971]: I0309 09:55:00.474035 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" event={"ID":"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff","Type":"ContainerDied","Data":"de07b8af261cf7553fcab6154141ca1b2d159d0c8e55dcc8a1b5d29ff5142bca"} Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.772180 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.803250 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p"] Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.809619 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p"] Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.894804 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.894896 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4vgq\" (UniqueName: \"kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.894918 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.894981 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.894998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.895015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts\") pod \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\" (UID: \"fbc0b5e5-8eb6-462b-8f88-26f0782d1aff\") " Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.896315 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.896485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.900095 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq" (OuterVolumeSpecName: "kube-api-access-x4vgq") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "kube-api-access-x4vgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.914230 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts" (OuterVolumeSpecName: "scripts") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.917181 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.926925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" (UID: "fbc0b5e5-8eb6-462b-8f88-26f0782d1aff"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996301 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4vgq\" (UniqueName: \"kubernetes.io/projected/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-kube-api-access-x4vgq\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996335 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996373 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996384 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996393 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:01 crc kubenswrapper[4971]: I0309 09:55:01.996402 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.493792 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75eafcfc8b5c87a33a6ddf0e40434eee93e09679bc0ad59e824033b731054f6e" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.493888 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7zd6p" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.946742 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"] Mar 09 09:55:02 crc kubenswrapper[4971]: E0309 09:55:02.947025 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" containerName="swift-ring-rebalance" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.947037 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" containerName="swift-ring-rebalance" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.947176 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" containerName="swift-ring-rebalance" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.947638 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.953019 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.953751 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:02 crc kubenswrapper[4971]: I0309 09:55:02.978056 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"] Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112218 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112296 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112413 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k4x5\" (UniqueName: \"kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112453 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112584 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.112626 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.161772 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc0b5e5-8eb6-462b-8f88-26f0782d1aff" path="/var/lib/kubelet/pods/fbc0b5e5-8eb6-462b-8f88-26f0782d1aff/volumes" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214329 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k4x5\" (UniqueName: \"kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214421 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214511 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: 
\"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214622 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.214648 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.215108 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.215184 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.216659 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: 
\"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.221747 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.222242 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.235059 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k4x5\" (UniqueName: \"kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5\") pod \"swift-ring-rebalance-debug-9hg6j\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.266655 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:03 crc kubenswrapper[4971]: I0309 09:55:03.705993 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"] Mar 09 09:55:03 crc kubenswrapper[4971]: W0309 09:55:03.711129 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72e50ad7_4649_44d2_a0b3_36bb92023bae.slice/crio-4eb4affebde468e4cad4859a92f9a5219c4ff00352aebcc781dbfb7eeb8ece82 WatchSource:0}: Error finding container 4eb4affebde468e4cad4859a92f9a5219c4ff00352aebcc781dbfb7eeb8ece82: Status 404 returned error can't find the container with id 4eb4affebde468e4cad4859a92f9a5219c4ff00352aebcc781dbfb7eeb8ece82 Mar 09 09:55:04 crc kubenswrapper[4971]: I0309 09:55:04.513881 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" event={"ID":"72e50ad7-4649-44d2-a0b3-36bb92023bae","Type":"ContainerStarted","Data":"f5c5b238e8e56fafd2adbc9a7e4cde6043a31cce836bf1c9ed40c1408ffd7060"} Mar 09 09:55:04 crc kubenswrapper[4971]: I0309 09:55:04.514284 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" event={"ID":"72e50ad7-4649-44d2-a0b3-36bb92023bae","Type":"ContainerStarted","Data":"4eb4affebde468e4cad4859a92f9a5219c4ff00352aebcc781dbfb7eeb8ece82"} Mar 09 09:55:04 crc kubenswrapper[4971]: I0309 09:55:04.533946 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" podStartSLOduration=2.533927637 podStartE2EDuration="2.533927637s" podCreationTimestamp="2026-03-09 09:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:04.53084122 +0000 UTC m=+2108.090769040" watchObservedRunningTime="2026-03-09 
09:55:04.533927637 +0000 UTC m=+2108.093855457" Mar 09 09:55:05 crc kubenswrapper[4971]: I0309 09:55:05.521993 4971 generic.go:334] "Generic (PLEG): container finished" podID="72e50ad7-4649-44d2-a0b3-36bb92023bae" containerID="f5c5b238e8e56fafd2adbc9a7e4cde6043a31cce836bf1c9ed40c1408ffd7060" exitCode=0 Mar 09 09:55:05 crc kubenswrapper[4971]: I0309 09:55:05.522067 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" event={"ID":"72e50ad7-4649-44d2-a0b3-36bb92023bae","Type":"ContainerDied","Data":"f5c5b238e8e56fafd2adbc9a7e4cde6043a31cce836bf1c9ed40c1408ffd7060"} Mar 09 09:55:05 crc kubenswrapper[4971]: I0309 09:55:05.700571 4971 scope.go:117] "RemoveContainer" containerID="a6ffb4ed080bb2d0151614bfb2935785b56947b5da5a3d8e21b253939225810b" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.880113 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.930722 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"] Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.939798 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"] Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970063 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k4x5\" (UniqueName: \"kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5\") pod \"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970171 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices\") pod 
\"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970247 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf\") pod \"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970273 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf\") pod \"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970426 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts\") pod \"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.970568 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift\") pod \"72e50ad7-4649-44d2-a0b3-36bb92023bae\" (UID: \"72e50ad7-4649-44d2-a0b3-36bb92023bae\") " Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.971689 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.971709 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.980246 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5" (OuterVolumeSpecName: "kube-api-access-6k4x5") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "kube-api-access-6k4x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.998370 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts" (OuterVolumeSpecName: "scripts") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.998679 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:06 crc kubenswrapper[4971]: I0309 09:55:06.998847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "72e50ad7-4649-44d2-a0b3-36bb92023bae" (UID: "72e50ad7-4649-44d2-a0b3-36bb92023bae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072549 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72e50ad7-4649-44d2-a0b3-36bb92023bae-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072603 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k4x5\" (UniqueName: \"kubernetes.io/projected/72e50ad7-4649-44d2-a0b3-36bb92023bae-kube-api-access-6k4x5\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072619 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072631 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072643 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72e50ad7-4649-44d2-a0b3-36bb92023bae-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.072656 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72e50ad7-4649-44d2-a0b3-36bb92023bae-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.161572 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e50ad7-4649-44d2-a0b3-36bb92023bae" path="/var/lib/kubelet/pods/72e50ad7-4649-44d2-a0b3-36bb92023bae/volumes"
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.542596 4971 scope.go:117] "RemoveContainer" containerID="f5c5b238e8e56fafd2adbc9a7e4cde6043a31cce836bf1c9ed40c1408ffd7060"
Mar 09 09:55:07 crc kubenswrapper[4971]: I0309 09:55:07.542601 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9hg6j"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.044262 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"]
Mar 09 09:55:08 crc kubenswrapper[4971]: E0309 09:55:08.044858 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e50ad7-4649-44d2-a0b3-36bb92023bae" containerName="swift-ring-rebalance"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.044871 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e50ad7-4649-44d2-a0b3-36bb92023bae" containerName="swift-ring-rebalance"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.045029 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e50ad7-4649-44d2-a0b3-36bb92023bae" containerName="swift-ring-rebalance"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.045479 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.047398 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.047515 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.062818 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"]
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188319 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69qbq\" (UniqueName: \"kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188392 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188469 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188519 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188543 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.188577 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289374 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289446 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289472 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289515 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289561 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69qbq\" (UniqueName: \"kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.289579 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.290558 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.290685 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.291326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.293651 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.293656 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.309000 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69qbq\" (UniqueName: \"kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq\") pod \"swift-ring-rebalance-debug-xhsmt\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.365630 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:08 crc kubenswrapper[4971]: I0309 09:55:08.861203 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"]
Mar 09 09:55:09 crc kubenswrapper[4971]: I0309 09:55:09.567466 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt" event={"ID":"3fbe7ef8-2b08-4723-976f-ed324fb782d2","Type":"ContainerStarted","Data":"dc45fcc3a20b773814ee053ffe3fefd339d84c7c11f2deacdc70a5377133616a"}
Mar 09 09:55:09 crc kubenswrapper[4971]: I0309 09:55:09.568110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt" event={"ID":"3fbe7ef8-2b08-4723-976f-ed324fb782d2","Type":"ContainerStarted","Data":"49d27ec57fc8d5755644ce475fb7981488c0972b6fa6b6baa2984090c8abb81d"}
Mar 09 09:55:09 crc kubenswrapper[4971]: I0309 09:55:09.586368 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt" podStartSLOduration=1.586332086 podStartE2EDuration="1.586332086s" podCreationTimestamp="2026-03-09 09:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:09.585872343 +0000 UTC m=+2113.145800163" watchObservedRunningTime="2026-03-09 09:55:09.586332086 +0000 UTC m=+2113.146259896"
Mar 09 09:55:10 crc kubenswrapper[4971]: I0309 09:55:10.578420 4971 generic.go:334] "Generic (PLEG): container finished" podID="3fbe7ef8-2b08-4723-976f-ed324fb782d2" containerID="dc45fcc3a20b773814ee053ffe3fefd339d84c7c11f2deacdc70a5377133616a" exitCode=0
Mar 09 09:55:10 crc kubenswrapper[4971]: I0309 09:55:10.578478 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt" event={"ID":"3fbe7ef8-2b08-4723-976f-ed324fb782d2","Type":"ContainerDied","Data":"dc45fcc3a20b773814ee053ffe3fefd339d84c7c11f2deacdc70a5377133616a"}
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.858136 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.898530 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"]
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.902892 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"]
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951164 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69qbq\" (UniqueName: \"kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951295 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951323 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951389 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951447 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf\") pod \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\" (UID: \"3fbe7ef8-2b08-4723-976f-ed324fb782d2\") "
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.951925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.952690 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.971718 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq" (OuterVolumeSpecName: "kube-api-access-69qbq") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "kube-api-access-69qbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.979235 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts" (OuterVolumeSpecName: "scripts") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.982894 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:11 crc kubenswrapper[4971]: I0309 09:55:11.985514 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3fbe7ef8-2b08-4723-976f-ed324fb782d2" (UID: "3fbe7ef8-2b08-4723-976f-ed324fb782d2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.052994 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.053033 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3fbe7ef8-2b08-4723-976f-ed324fb782d2-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.053047 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69qbq\" (UniqueName: \"kubernetes.io/projected/3fbe7ef8-2b08-4723-976f-ed324fb782d2-kube-api-access-69qbq\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.053062 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3fbe7ef8-2b08-4723-976f-ed324fb782d2-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.053073 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.053084 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fbe7ef8-2b08-4723-976f-ed324fb782d2-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.608174 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d27ec57fc8d5755644ce475fb7981488c0972b6fa6b6baa2984090c8abb81d"
Mar 09 09:55:12 crc kubenswrapper[4971]: I0309 09:55:12.608236 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xhsmt"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.020659 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"]
Mar 09 09:55:13 crc kubenswrapper[4971]: E0309 09:55:13.021326 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbe7ef8-2b08-4723-976f-ed324fb782d2" containerName="swift-ring-rebalance"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.021345 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbe7ef8-2b08-4723-976f-ed324fb782d2" containerName="swift-ring-rebalance"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.021561 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fbe7ef8-2b08-4723-976f-ed324fb782d2" containerName="swift-ring-rebalance"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.022078 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.024458 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.024736 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.033222 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"]
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.160761 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbe7ef8-2b08-4723-976f-ed324fb782d2" path="/var/lib/kubelet/pods/3fbe7ef8-2b08-4723-976f-ed324fb782d2/volumes"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168452 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168497 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzfm6\" (UniqueName: \"kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168525 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168565 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168586 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.168916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270395 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270542 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzfm6\" (UniqueName: \"kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270565 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270635 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.270674 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.271307 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.272738 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.272798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.277182 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.284454 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.295598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzfm6\" (UniqueName: \"kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6\") pod \"swift-ring-rebalance-debug-8bjwz\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.339861 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:13 crc kubenswrapper[4971]: I0309 09:55:13.776802 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"]
Mar 09 09:55:14 crc kubenswrapper[4971]: I0309 09:55:14.624808 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz" event={"ID":"56bdd5be-3049-4d57-94f3-0975b8358331","Type":"ContainerStarted","Data":"3d02852e9f0120aeee6b42032ea826d0122f39ceaa50ab5e73f2989f499907d7"}
Mar 09 09:55:14 crc kubenswrapper[4971]: I0309 09:55:14.625172 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz" event={"ID":"56bdd5be-3049-4d57-94f3-0975b8358331","Type":"ContainerStarted","Data":"4ffee2344fb21b7ca8a4746d9860bef176f4eadc5a5681689a23524dc157b94a"}
Mar 09 09:55:14 crc kubenswrapper[4971]: I0309 09:55:14.649117 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz" podStartSLOduration=1.649090088 podStartE2EDuration="1.649090088s" podCreationTimestamp="2026-03-09 09:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:14.641634838 +0000 UTC m=+2118.201562668" watchObservedRunningTime="2026-03-09 09:55:14.649090088 +0000 UTC m=+2118.209017908"
Mar 09 09:55:15 crc kubenswrapper[4971]: I0309 09:55:15.635781 4971 generic.go:334] "Generic (PLEG): container finished" podID="56bdd5be-3049-4d57-94f3-0975b8358331" containerID="3d02852e9f0120aeee6b42032ea826d0122f39ceaa50ab5e73f2989f499907d7" exitCode=0
Mar 09 09:55:15 crc kubenswrapper[4971]: I0309 09:55:15.635881 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz" event={"ID":"56bdd5be-3049-4d57-94f3-0975b8358331","Type":"ContainerDied","Data":"3d02852e9f0120aeee6b42032ea826d0122f39ceaa50ab5e73f2989f499907d7"}
Mar 09 09:55:16 crc kubenswrapper[4971]: I0309 09:55:16.945244 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"
Mar 09 09:55:16 crc kubenswrapper[4971]: I0309 09:55:16.985796 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"]
Mar 09 09:55:16 crc kubenswrapper[4971]: I0309 09:55:16.991505 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz"]
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030045 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030184 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzfm6\" (UniqueName: \"kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030224 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030332 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030391 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030435 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift\") pod \"56bdd5be-3049-4d57-94f3-0975b8358331\" (UID: \"56bdd5be-3049-4d57-94f3-0975b8358331\") "
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.030997 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.031869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.036601 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6" (OuterVolumeSpecName: "kube-api-access-kzfm6") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "kube-api-access-kzfm6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.054704 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.055016 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.065127 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts" (OuterVolumeSpecName: "scripts") pod "56bdd5be-3049-4d57-94f3-0975b8358331" (UID: "56bdd5be-3049-4d57-94f3-0975b8358331"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132379 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132407 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132416 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/56bdd5be-3049-4d57-94f3-0975b8358331-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132425 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/56bdd5be-3049-4d57-94f3-0975b8358331-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132435 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzfm6\" (UniqueName: \"kubernetes.io/projected/56bdd5be-3049-4d57-94f3-0975b8358331-kube-api-access-kzfm6\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.132444 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/56bdd5be-3049-4d57-94f3-0975b8358331-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.161472 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bdd5be-3049-4d57-94f3-0975b8358331" path="/var/lib/kubelet/pods/56bdd5be-3049-4d57-94f3-0975b8358331/volumes"
Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.659225 4971 scope.go:117] "RemoveContainer"
containerID="3d02852e9f0120aeee6b42032ea826d0122f39ceaa50ab5e73f2989f499907d7" Mar 09 09:55:17 crc kubenswrapper[4971]: I0309 09:55:17.659408 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bjwz" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.125630 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl"] Mar 09 09:55:18 crc kubenswrapper[4971]: E0309 09:55:18.126168 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bdd5be-3049-4d57-94f3-0975b8358331" containerName="swift-ring-rebalance" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.126180 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bdd5be-3049-4d57-94f3-0975b8358331" containerName="swift-ring-rebalance" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.126434 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bdd5be-3049-4d57-94f3-0975b8358331" containerName="swift-ring-rebalance" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.126893 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.129449 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.130869 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.136814 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl"] Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248041 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248084 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248387 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248610 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h2vfd\" (UniqueName: \"kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248715 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.248784 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.350735 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vfd\" (UniqueName: \"kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.350789 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc 
kubenswrapper[4971]: I0309 09:55:18.350811 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.350842 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.350860 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.350915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.351432 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 
09:55:18.351914 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.352094 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.354989 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.359868 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.367268 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vfd\" (UniqueName: \"kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd\") pod \"swift-ring-rebalance-debug-8q2gl\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.442635 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:18 crc kubenswrapper[4971]: I0309 09:55:18.872182 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl"] Mar 09 09:55:18 crc kubenswrapper[4971]: W0309 09:55:18.878886 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45901fbf_99ef_4ef7_9cc5_6783ae123708.slice/crio-b1d4a0eb8d3ae69c0c644abf378e3dec96785d2f669d4d356dcfb78ea2a97bdd WatchSource:0}: Error finding container b1d4a0eb8d3ae69c0c644abf378e3dec96785d2f669d4d356dcfb78ea2a97bdd: Status 404 returned error can't find the container with id b1d4a0eb8d3ae69c0c644abf378e3dec96785d2f669d4d356dcfb78ea2a97bdd Mar 09 09:55:19 crc kubenswrapper[4971]: I0309 09:55:19.683270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" event={"ID":"45901fbf-99ef-4ef7-9cc5-6783ae123708","Type":"ContainerStarted","Data":"387b2365a9ff29875776b85adc14987f2110e074d2204addf6cc14451029b082"} Mar 09 09:55:19 crc kubenswrapper[4971]: I0309 09:55:19.684322 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" event={"ID":"45901fbf-99ef-4ef7-9cc5-6783ae123708","Type":"ContainerStarted","Data":"b1d4a0eb8d3ae69c0c644abf378e3dec96785d2f669d4d356dcfb78ea2a97bdd"} Mar 09 09:55:20 crc kubenswrapper[4971]: I0309 09:55:20.699966 4971 generic.go:334] "Generic (PLEG): container finished" podID="45901fbf-99ef-4ef7-9cc5-6783ae123708" containerID="387b2365a9ff29875776b85adc14987f2110e074d2204addf6cc14451029b082" exitCode=0 Mar 09 09:55:20 crc kubenswrapper[4971]: I0309 09:55:20.701152 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" 
event={"ID":"45901fbf-99ef-4ef7-9cc5-6783ae123708","Type":"ContainerDied","Data":"387b2365a9ff29875776b85adc14987f2110e074d2204addf6cc14451029b082"} Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.016564 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.046702 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl"] Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.052072 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl"] Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112227 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112293 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2vfd\" (UniqueName: \"kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112433 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112477 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112541 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112564 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf\") pod \"45901fbf-99ef-4ef7-9cc5-6783ae123708\" (UID: \"45901fbf-99ef-4ef7-9cc5-6783ae123708\") " Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.112841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.113232 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.118071 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd" (OuterVolumeSpecName: "kube-api-access-h2vfd") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "kube-api-access-h2vfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.136820 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.141497 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts" (OuterVolumeSpecName: "scripts") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.141798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "45901fbf-99ef-4ef7-9cc5-6783ae123708" (UID: "45901fbf-99ef-4ef7-9cc5-6783ae123708"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214522 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214554 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214565 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45901fbf-99ef-4ef7-9cc5-6783ae123708-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214577 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2vfd\" (UniqueName: \"kubernetes.io/projected/45901fbf-99ef-4ef7-9cc5-6783ae123708-kube-api-access-h2vfd\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214595 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45901fbf-99ef-4ef7-9cc5-6783ae123708-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.214605 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45901fbf-99ef-4ef7-9cc5-6783ae123708-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.718532 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d4a0eb8d3ae69c0c644abf378e3dec96785d2f669d4d356dcfb78ea2a97bdd" Mar 09 09:55:22 crc kubenswrapper[4971]: I0309 09:55:22.718603 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8q2gl" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.161724 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45901fbf-99ef-4ef7-9cc5-6783ae123708" path="/var/lib/kubelet/pods/45901fbf-99ef-4ef7-9cc5-6783ae123708/volumes" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.190160 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh"] Mar 09 09:55:23 crc kubenswrapper[4971]: E0309 09:55:23.190537 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45901fbf-99ef-4ef7-9cc5-6783ae123708" containerName="swift-ring-rebalance" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.190557 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="45901fbf-99ef-4ef7-9cc5-6783ae123708" containerName="swift-ring-rebalance" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.190796 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="45901fbf-99ef-4ef7-9cc5-6783ae123708" containerName="swift-ring-rebalance" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.191396 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.193989 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.196719 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.200561 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh"] Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330317 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330425 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330502 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330589 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330686 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6bh\" (UniqueName: \"kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.330734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.432338 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.432433 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6bh\" (UniqueName: \"kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc 
kubenswrapper[4971]: I0309 09:55:23.432480 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.432906 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.433322 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.433341 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.433486 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 
09:55:23.433523 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.434072 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.436819 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.436955 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: I0309 09:55:23.449205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6bh\" (UniqueName: \"kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh\") pod \"swift-ring-rebalance-debug-4ttdh\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:23 crc kubenswrapper[4971]: 
I0309 09:55:23.521390 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:24 crc kubenswrapper[4971]: I0309 09:55:24.004969 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh"] Mar 09 09:55:24 crc kubenswrapper[4971]: I0309 09:55:24.738698 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" event={"ID":"ba5d91d1-801d-4210-8374-35b8fc5a7611","Type":"ContainerStarted","Data":"330739818d80c60c2e3845ec0ef77943f8f055cf3cd5544489160b28e44b4f39"} Mar 09 09:55:24 crc kubenswrapper[4971]: I0309 09:55:24.739043 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" event={"ID":"ba5d91d1-801d-4210-8374-35b8fc5a7611","Type":"ContainerStarted","Data":"0eea51af49e947cae2fc14a305a6d49bb04d395525dd210ca1eea8617d8c0a17"} Mar 09 09:55:24 crc kubenswrapper[4971]: I0309 09:55:24.759967 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" podStartSLOduration=1.759940529 podStartE2EDuration="1.759940529s" podCreationTimestamp="2026-03-09 09:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:24.756950224 +0000 UTC m=+2128.316878034" watchObservedRunningTime="2026-03-09 09:55:24.759940529 +0000 UTC m=+2128.319868359" Mar 09 09:55:25 crc kubenswrapper[4971]: I0309 09:55:25.749785 4971 generic.go:334] "Generic (PLEG): container finished" podID="ba5d91d1-801d-4210-8374-35b8fc5a7611" containerID="330739818d80c60c2e3845ec0ef77943f8f055cf3cd5544489160b28e44b4f39" exitCode=0 Mar 09 09:55:25 crc kubenswrapper[4971]: I0309 09:55:25.749838 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" 
event={"ID":"ba5d91d1-801d-4210-8374-35b8fc5a7611","Type":"ContainerDied","Data":"330739818d80c60c2e3845ec0ef77943f8f055cf3cd5544489160b28e44b4f39"} Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.082096 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.112757 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh"] Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.123328 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh"] Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj6bh\" (UniqueName: \"kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234871 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234903 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234933 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234959 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.234992 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift\") pod \"ba5d91d1-801d-4210-8374-35b8fc5a7611\" (UID: \"ba5d91d1-801d-4210-8374-35b8fc5a7611\") " Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.235881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.236152 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.240900 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh" (OuterVolumeSpecName: "kube-api-access-fj6bh") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "kube-api-access-fj6bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.258079 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts" (OuterVolumeSpecName: "scripts") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.260825 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.261567 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ba5d91d1-801d-4210-8374-35b8fc5a7611" (UID: "ba5d91d1-801d-4210-8374-35b8fc5a7611"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336538 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj6bh\" (UniqueName: \"kubernetes.io/projected/ba5d91d1-801d-4210-8374-35b8fc5a7611-kube-api-access-fj6bh\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336594 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336607 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336619 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ba5d91d1-801d-4210-8374-35b8fc5a7611-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336630 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba5d91d1-801d-4210-8374-35b8fc5a7611-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.336642 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ba5d91d1-801d-4210-8374-35b8fc5a7611-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.770990 4971 scope.go:117] "RemoveContainer" containerID="330739818d80c60c2e3845ec0ef77943f8f055cf3cd5544489160b28e44b4f39" Mar 09 09:55:27 crc kubenswrapper[4971]: I0309 09:55:27.771063 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4ttdh" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.270882 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc"] Mar 09 09:55:28 crc kubenswrapper[4971]: E0309 09:55:28.271592 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5d91d1-801d-4210-8374-35b8fc5a7611" containerName="swift-ring-rebalance" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.271611 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5d91d1-801d-4210-8374-35b8fc5a7611" containerName="swift-ring-rebalance" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.272019 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5d91d1-801d-4210-8374-35b8fc5a7611" containerName="swift-ring-rebalance" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.272933 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.276205 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.278000 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.303258 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc"] Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455514 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455566 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n44n\" (UniqueName: \"kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455595 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.455661 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.557804 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.557931 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.558061 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.558105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n44n\" (UniqueName: \"kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.558153 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.558197 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.558814 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.560041 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.560165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.562706 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.564179 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.575674 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n44n\" (UniqueName: \"kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n\") pod \"swift-ring-rebalance-debug-lwqcc\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:28 crc kubenswrapper[4971]: I0309 09:55:28.599171 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:29 crc kubenswrapper[4971]: I0309 09:55:29.161292 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5d91d1-801d-4210-8374-35b8fc5a7611" path="/var/lib/kubelet/pods/ba5d91d1-801d-4210-8374-35b8fc5a7611/volumes" Mar 09 09:55:29 crc kubenswrapper[4971]: I0309 09:55:29.204381 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc"] Mar 09 09:55:29 crc kubenswrapper[4971]: I0309 09:55:29.797339 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" event={"ID":"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05","Type":"ContainerStarted","Data":"981f1843c34316c5481bd60d5a3239a73b420beb85b61747c32bbd64f3b70863"} Mar 09 09:55:29 crc kubenswrapper[4971]: I0309 09:55:29.797734 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" event={"ID":"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05","Type":"ContainerStarted","Data":"d89f70e1670657d3169177916d88f60cba52469db2d20001148a25d78dbb6fc6"} Mar 09 09:55:30 crc kubenswrapper[4971]: I0309 09:55:30.807733 4971 generic.go:334] "Generic (PLEG): container finished" podID="ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" containerID="981f1843c34316c5481bd60d5a3239a73b420beb85b61747c32bbd64f3b70863" exitCode=0 Mar 09 09:55:30 crc kubenswrapper[4971]: I0309 09:55:30.807774 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" event={"ID":"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05","Type":"ContainerDied","Data":"981f1843c34316c5481bd60d5a3239a73b420beb85b61747c32bbd64f3b70863"} Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.100960 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115497 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n44n\" (UniqueName: \"kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115545 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115586 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115685 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115726 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.115760 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf\") pod \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\" (UID: \"ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05\") " Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.116476 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.116549 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.120998 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n" (OuterVolumeSpecName: "kube-api-access-6n44n") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "kube-api-access-6n44n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.150428 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts" (OuterVolumeSpecName: "scripts") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.170100 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.176994 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc"] Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.179415 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" (UID: "ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.184814 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc"] Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217165 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217208 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217221 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217232 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n44n\" (UniqueName: \"kubernetes.io/projected/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-kube-api-access-6n44n\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217246 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.217257 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.825249 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d89f70e1670657d3169177916d88f60cba52469db2d20001148a25d78dbb6fc6" Mar 09 09:55:32 crc kubenswrapper[4971]: I0309 09:55:32.825321 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwqcc" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.163143 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" path="/var/lib/kubelet/pods/ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05/volumes" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.263904 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tls4v"] Mar 09 09:55:33 crc kubenswrapper[4971]: E0309 09:55:33.264174 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" containerName="swift-ring-rebalance" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.264190 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" containerName="swift-ring-rebalance" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.264338 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0bc10c-ed34-4019-9ee1-1fd82e5a9d05" containerName="swift-ring-rebalance" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.264840 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.266764 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.267897 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.276208 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tls4v"] Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331303 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555pk\" (UniqueName: \"kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331451 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331490 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331545 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.331667 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.432956 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.433010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: 
I0309 09:55:33.433038 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.433069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.433104 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.433141 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555pk\" (UniqueName: \"kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.433591 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc 
kubenswrapper[4971]: I0309 09:55:33.433903 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.434167 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.436773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.437669 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: I0309 09:55:33.455855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555pk\" (UniqueName: \"kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk\") pod \"swift-ring-rebalance-debug-tls4v\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:33 crc kubenswrapper[4971]: 
I0309 09:55:33.635049 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:34 crc kubenswrapper[4971]: I0309 09:55:34.136044 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tls4v"] Mar 09 09:55:34 crc kubenswrapper[4971]: I0309 09:55:34.846554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" event={"ID":"9441a9ca-ccfa-450d-8ecb-1aa820557164","Type":"ContainerStarted","Data":"9b75a598bd5760c08bfec3d215e963d264f41fcd7034641b137fb9e1250ee069"} Mar 09 09:55:34 crc kubenswrapper[4971]: I0309 09:55:34.846604 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" event={"ID":"9441a9ca-ccfa-450d-8ecb-1aa820557164","Type":"ContainerStarted","Data":"a16301ba13b92eadfc7082127223c196a22efdf7ca4ad1dd3b4d8fdb48ec69ec"} Mar 09 09:55:34 crc kubenswrapper[4971]: I0309 09:55:34.870780 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" podStartSLOduration=1.870758687 podStartE2EDuration="1.870758687s" podCreationTimestamp="2026-03-09 09:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:34.863614846 +0000 UTC m=+2138.423542656" watchObservedRunningTime="2026-03-09 09:55:34.870758687 +0000 UTC m=+2138.430686497" Mar 09 09:55:35 crc kubenswrapper[4971]: I0309 09:55:35.861718 4971 generic.go:334] "Generic (PLEG): container finished" podID="9441a9ca-ccfa-450d-8ecb-1aa820557164" containerID="9b75a598bd5760c08bfec3d215e963d264f41fcd7034641b137fb9e1250ee069" exitCode=0 Mar 09 09:55:35 crc kubenswrapper[4971]: I0309 09:55:35.861769 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" 
event={"ID":"9441a9ca-ccfa-450d-8ecb-1aa820557164","Type":"ContainerDied","Data":"9b75a598bd5760c08bfec3d215e963d264f41fcd7034641b137fb9e1250ee069"} Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.163623 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.200242 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tls4v"] Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.228575 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tls4v"] Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288675 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288807 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288847 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288887 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-555pk\" (UniqueName: 
\"kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288919 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.288977 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift\") pod \"9441a9ca-ccfa-450d-8ecb-1aa820557164\" (UID: \"9441a9ca-ccfa-450d-8ecb-1aa820557164\") " Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.289987 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.290386 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.305499 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk" (OuterVolumeSpecName: "kube-api-access-555pk") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "kube-api-access-555pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.314559 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts" (OuterVolumeSpecName: "scripts") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.314744 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.672484 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9441a9ca-ccfa-450d-8ecb-1aa820557164" (UID: "9441a9ca-ccfa-450d-8ecb-1aa820557164"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.675745 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.675776 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9441a9ca-ccfa-450d-8ecb-1aa820557164-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.675789 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.675798 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-555pk\" (UniqueName: \"kubernetes.io/projected/9441a9ca-ccfa-450d-8ecb-1aa820557164-kube-api-access-555pk\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.675810 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9441a9ca-ccfa-450d-8ecb-1aa820557164-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.676054 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9441a9ca-ccfa-450d-8ecb-1aa820557164-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.879247 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a16301ba13b92eadfc7082127223c196a22efdf7ca4ad1dd3b4d8fdb48ec69ec" Mar 09 09:55:37 crc kubenswrapper[4971]: I0309 09:55:37.879536 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tls4v" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.350916 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-trxfp"] Mar 09 09:55:38 crc kubenswrapper[4971]: E0309 09:55:38.351215 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9441a9ca-ccfa-450d-8ecb-1aa820557164" containerName="swift-ring-rebalance" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.351228 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="9441a9ca-ccfa-450d-8ecb-1aa820557164" containerName="swift-ring-rebalance" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.351404 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="9441a9ca-ccfa-450d-8ecb-1aa820557164" containerName="swift-ring-rebalance" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.351903 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.353834 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.354181 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.361880 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-trxfp"] Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384673 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384746 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384771 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfnxl\" (UniqueName: \"kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384828 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384920 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.384958 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485722 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485790 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485819 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485853 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485877 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfnxl\" 
(UniqueName: \"kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.485922 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.487314 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.487440 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.487565 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.491806 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.505021 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.509007 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfnxl\" (UniqueName: \"kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl\") pod \"swift-ring-rebalance-debug-trxfp\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.691801 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:38 crc kubenswrapper[4971]: I0309 09:55:38.927892 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-trxfp"] Mar 09 09:55:38 crc kubenswrapper[4971]: W0309 09:55:38.934558 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc13ef8b_d0df_4b38_9905_ff34f13ef756.slice/crio-2b0bd3b1e26f9a2867d29bd534225010183f63e97fd1de3215e5c40e282bc461 WatchSource:0}: Error finding container 2b0bd3b1e26f9a2867d29bd534225010183f63e97fd1de3215e5c40e282bc461: Status 404 returned error can't find the container with id 2b0bd3b1e26f9a2867d29bd534225010183f63e97fd1de3215e5c40e282bc461 Mar 09 09:55:39 crc kubenswrapper[4971]: I0309 09:55:39.164819 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9441a9ca-ccfa-450d-8ecb-1aa820557164" path="/var/lib/kubelet/pods/9441a9ca-ccfa-450d-8ecb-1aa820557164/volumes" Mar 09 09:55:39 crc kubenswrapper[4971]: I0309 09:55:39.898179 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" event={"ID":"fc13ef8b-d0df-4b38-9905-ff34f13ef756","Type":"ContainerStarted","Data":"dd164751dddf8119010b520442b9226651c4d94e43e1fd8edc0ae8f360889bd4"} Mar 09 09:55:39 crc kubenswrapper[4971]: I0309 09:55:39.899703 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" event={"ID":"fc13ef8b-d0df-4b38-9905-ff34f13ef756","Type":"ContainerStarted","Data":"2b0bd3b1e26f9a2867d29bd534225010183f63e97fd1de3215e5c40e282bc461"} Mar 09 09:55:39 crc kubenswrapper[4971]: I0309 09:55:39.931730 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" podStartSLOduration=1.931713098 podStartE2EDuration="1.931713098s" podCreationTimestamp="2026-03-09 
09:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:39.924516545 +0000 UTC m=+2143.484444355" watchObservedRunningTime="2026-03-09 09:55:39.931713098 +0000 UTC m=+2143.491640908" Mar 09 09:55:40 crc kubenswrapper[4971]: I0309 09:55:40.908444 4971 generic.go:334] "Generic (PLEG): container finished" podID="fc13ef8b-d0df-4b38-9905-ff34f13ef756" containerID="dd164751dddf8119010b520442b9226651c4d94e43e1fd8edc0ae8f360889bd4" exitCode=0 Mar 09 09:55:40 crc kubenswrapper[4971]: I0309 09:55:40.908498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" event={"ID":"fc13ef8b-d0df-4b38-9905-ff34f13ef756","Type":"ContainerDied","Data":"dd164751dddf8119010b520442b9226651c4d94e43e1fd8edc0ae8f360889bd4"} Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.253703 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.291400 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-trxfp"] Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.295047 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-trxfp"] Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.446818 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.446872 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfnxl\" (UniqueName: 
\"kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.446900 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.446957 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.447046 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.447119 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift\") pod \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\" (UID: \"fc13ef8b-d0df-4b38-9905-ff34f13ef756\") " Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.447926 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.448101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.452273 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl" (OuterVolumeSpecName: "kube-api-access-zfnxl") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "kube-api-access-zfnxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.468060 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts" (OuterVolumeSpecName: "scripts") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.469216 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.470101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fc13ef8b-d0df-4b38-9905-ff34f13ef756" (UID: "fc13ef8b-d0df-4b38-9905-ff34f13ef756"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548508 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548560 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfnxl\" (UniqueName: \"kubernetes.io/projected/fc13ef8b-d0df-4b38-9905-ff34f13ef756-kube-api-access-zfnxl\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548578 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fc13ef8b-d0df-4b38-9905-ff34f13ef756-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548592 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548606 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fc13ef8b-d0df-4b38-9905-ff34f13ef756-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.548618 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/fc13ef8b-d0df-4b38-9905-ff34f13ef756-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.933505 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b0bd3b1e26f9a2867d29bd534225010183f63e97fd1de3215e5c40e282bc461" Mar 09 09:55:42 crc kubenswrapper[4971]: I0309 09:55:42.933587 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-trxfp" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.162064 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc13ef8b-d0df-4b38-9905-ff34f13ef756" path="/var/lib/kubelet/pods/fc13ef8b-d0df-4b38-9905-ff34f13ef756/volumes" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.420320 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8"] Mar 09 09:55:43 crc kubenswrapper[4971]: E0309 09:55:43.420628 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc13ef8b-d0df-4b38-9905-ff34f13ef756" containerName="swift-ring-rebalance" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.420640 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13ef8b-d0df-4b38-9905-ff34f13ef756" containerName="swift-ring-rebalance" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.420810 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc13ef8b-d0df-4b38-9905-ff34f13ef756" containerName="swift-ring-rebalance" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.421266 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.423215 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.423458 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.432087 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8"] Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.460715 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.460794 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.460867 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.460938 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.460971 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563055 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563120 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563174 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znltw\" (UniqueName: \"kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563202 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.563279 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.564328 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.564533 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.564540 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.569013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.578019 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.665157 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znltw\" (UniqueName: \"kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.684972 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znltw\" (UniqueName: \"kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw\") pod \"swift-ring-rebalance-debug-8qrq8\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:43 crc kubenswrapper[4971]: I0309 09:55:43.737665 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:44 crc kubenswrapper[4971]: I0309 09:55:44.152803 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8"] Mar 09 09:55:44 crc kubenswrapper[4971]: I0309 09:55:44.958215 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" event={"ID":"2dc785f6-1a40-487a-97da-c13ad804c776","Type":"ContainerStarted","Data":"0e67b4838cd380b4b2100e2af862182488f46704ee234f96714aade59f12ccc8"} Mar 09 09:55:44 crc kubenswrapper[4971]: I0309 09:55:44.958613 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" event={"ID":"2dc785f6-1a40-487a-97da-c13ad804c776","Type":"ContainerStarted","Data":"1c61b3451ac34d2fd76aed6d89e334ef4242a0658cc48970e53ecffda3b200dd"} Mar 09 09:55:44 crc kubenswrapper[4971]: I0309 09:55:44.999407 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" podStartSLOduration=1.999380749 podStartE2EDuration="1.999380749s" podCreationTimestamp="2026-03-09 09:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:44.986917657 +0000 UTC m=+2148.546845487" watchObservedRunningTime="2026-03-09 09:55:44.999380749 +0000 UTC m=+2148.559308559" Mar 09 09:55:45 crc kubenswrapper[4971]: I0309 09:55:45.970743 4971 generic.go:334] "Generic (PLEG): container finished" podID="2dc785f6-1a40-487a-97da-c13ad804c776" containerID="0e67b4838cd380b4b2100e2af862182488f46704ee234f96714aade59f12ccc8" exitCode=0 Mar 09 09:55:45 crc kubenswrapper[4971]: I0309 09:55:45.970888 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" 
event={"ID":"2dc785f6-1a40-487a-97da-c13ad804c776","Type":"ContainerDied","Data":"0e67b4838cd380b4b2100e2af862182488f46704ee234f96714aade59f12ccc8"} Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.319844 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.360559 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8"] Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.366283 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8"] Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435299 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znltw\" (UniqueName: \"kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435524 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435551 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435567 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.435601 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift\") pod \"2dc785f6-1a40-487a-97da-c13ad804c776\" (UID: \"2dc785f6-1a40-487a-97da-c13ad804c776\") " Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.436628 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.436783 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.440944 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw" (OuterVolumeSpecName: "kube-api-access-znltw") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "kube-api-access-znltw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.455521 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts" (OuterVolumeSpecName: "scripts") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.461196 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.461644 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2dc785f6-1a40-487a-97da-c13ad804c776" (UID: "2dc785f6-1a40-487a-97da-c13ad804c776"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537464 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537514 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znltw\" (UniqueName: \"kubernetes.io/projected/2dc785f6-1a40-487a-97da-c13ad804c776-kube-api-access-znltw\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537539 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2dc785f6-1a40-487a-97da-c13ad804c776-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537557 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537574 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2dc785f6-1a40-487a-97da-c13ad804c776-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.537590 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2dc785f6-1a40-487a-97da-c13ad804c776-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.993031 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c61b3451ac34d2fd76aed6d89e334ef4242a0658cc48970e53ecffda3b200dd" Mar 09 09:55:47 crc kubenswrapper[4971]: I0309 09:55:47.993110 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8qrq8" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.480981 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwj56"] Mar 09 09:55:48 crc kubenswrapper[4971]: E0309 09:55:48.481630 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc785f6-1a40-487a-97da-c13ad804c776" containerName="swift-ring-rebalance" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.481644 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc785f6-1a40-487a-97da-c13ad804c776" containerName="swift-ring-rebalance" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.481817 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc785f6-1a40-487a-97da-c13ad804c776" containerName="swift-ring-rebalance" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.482336 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.484989 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.495245 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwj56"] Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.500625 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654410 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654532 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s59zw\" (UniqueName: \"kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654571 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654642 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654713 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.654801 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756104 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756263 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756341 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s59zw\" 
(UniqueName: \"kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.756396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.757181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.757630 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.758978 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.761471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.761585 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.789852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s59zw\" (UniqueName: \"kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw\") pod \"swift-ring-rebalance-debug-wwj56\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:48 crc kubenswrapper[4971]: I0309 09:55:48.798668 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:49 crc kubenswrapper[4971]: I0309 09:55:49.161474 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc785f6-1a40-487a-97da-c13ad804c776" path="/var/lib/kubelet/pods/2dc785f6-1a40-487a-97da-c13ad804c776/volumes" Mar 09 09:55:49 crc kubenswrapper[4971]: I0309 09:55:49.230294 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwj56"] Mar 09 09:55:50 crc kubenswrapper[4971]: I0309 09:55:50.011499 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" event={"ID":"c4be6d7a-476e-4340-8fb4-43542fe2adca","Type":"ContainerStarted","Data":"e494e41aa24e964423bfe50acc40db831782757ecaf70f17b9cf8095a61cf1f0"} Mar 09 09:55:50 crc kubenswrapper[4971]: I0309 09:55:50.012082 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" event={"ID":"c4be6d7a-476e-4340-8fb4-43542fe2adca","Type":"ContainerStarted","Data":"1ba5721f5061ecb3b440dc145fbde27f0d2947fc0d3d7816eca34767e2d5f940"} Mar 09 09:55:50 crc kubenswrapper[4971]: I0309 09:55:50.032227 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" podStartSLOduration=2.032202176 podStartE2EDuration="2.032202176s" podCreationTimestamp="2026-03-09 09:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:50.025737483 +0000 UTC m=+2153.585665293" watchObservedRunningTime="2026-03-09 09:55:50.032202176 +0000 UTC m=+2153.592129986" Mar 09 09:55:51 crc kubenswrapper[4971]: I0309 09:55:51.024517 4971 generic.go:334] "Generic (PLEG): container finished" podID="c4be6d7a-476e-4340-8fb4-43542fe2adca" containerID="e494e41aa24e964423bfe50acc40db831782757ecaf70f17b9cf8095a61cf1f0" exitCode=0 
Mar 09 09:55:51 crc kubenswrapper[4971]: I0309 09:55:51.024561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" event={"ID":"c4be6d7a-476e-4340-8fb4-43542fe2adca","Type":"ContainerDied","Data":"e494e41aa24e964423bfe50acc40db831782757ecaf70f17b9cf8095a61cf1f0"} Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.350695 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.386880 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwj56"] Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.394826 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwj56"] Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407667 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407736 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407782 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407799 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s59zw\" (UniqueName: \"kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.407861 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices\") pod \"c4be6d7a-476e-4340-8fb4-43542fe2adca\" (UID: \"c4be6d7a-476e-4340-8fb4-43542fe2adca\") " Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.408819 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.409096 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.422734 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw" (OuterVolumeSpecName: "kube-api-access-s59zw") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "kube-api-access-s59zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.437553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.439116 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts" (OuterVolumeSpecName: "scripts") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.448765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c4be6d7a-476e-4340-8fb4-43542fe2adca" (UID: "c4be6d7a-476e-4340-8fb4-43542fe2adca"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509130 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c4be6d7a-476e-4340-8fb4-43542fe2adca-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509168 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509178 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509189 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s59zw\" (UniqueName: \"kubernetes.io/projected/c4be6d7a-476e-4340-8fb4-43542fe2adca-kube-api-access-s59zw\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509198 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c4be6d7a-476e-4340-8fb4-43542fe2adca-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:52 crc kubenswrapper[4971]: I0309 09:55:52.509206 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c4be6d7a-476e-4340-8fb4-43542fe2adca-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.044063 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba5721f5061ecb3b440dc145fbde27f0d2947fc0d3d7816eca34767e2d5f940" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.044152 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwj56" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.162158 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4be6d7a-476e-4340-8fb4-43542fe2adca" path="/var/lib/kubelet/pods/c4be6d7a-476e-4340-8fb4-43542fe2adca/volumes" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.562692 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw"] Mar 09 09:55:53 crc kubenswrapper[4971]: E0309 09:55:53.563006 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4be6d7a-476e-4340-8fb4-43542fe2adca" containerName="swift-ring-rebalance" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.563020 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4be6d7a-476e-4340-8fb4-43542fe2adca" containerName="swift-ring-rebalance" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.563153 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4be6d7a-476e-4340-8fb4-43542fe2adca" containerName="swift-ring-rebalance" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.563634 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.565615 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.566213 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.582586 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw"] Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725134 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725215 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725254 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725289 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725383 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.725478 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.827426 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.827831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc 
kubenswrapper[4971]: I0309 09:55:53.827984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.828105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.828237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.828400 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.828032 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc 
kubenswrapper[4971]: I0309 09:55:53.828745 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.828963 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.833564 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.833954 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: I0309 09:55:53.845087 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn\") pod \"swift-ring-rebalance-debug-zbxhw\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:53 crc kubenswrapper[4971]: 
I0309 09:55:53.885364 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:54 crc kubenswrapper[4971]: I0309 09:55:54.326365 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw"] Mar 09 09:55:55 crc kubenswrapper[4971]: I0309 09:55:55.069238 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" event={"ID":"7a8f0858-32a6-478b-9c8b-afbf58114bc4","Type":"ContainerStarted","Data":"addf5a50f8ae5778125e8c51fedd74e68fad53dc157ea2cd71b543cc15a49308"} Mar 09 09:55:55 crc kubenswrapper[4971]: I0309 09:55:55.069592 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" event={"ID":"7a8f0858-32a6-478b-9c8b-afbf58114bc4","Type":"ContainerStarted","Data":"90e12fddd109684672b4242e760ad8284b0143455298be55f759fd141c2833b7"} Mar 09 09:55:55 crc kubenswrapper[4971]: I0309 09:55:55.089143 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" podStartSLOduration=2.089122553 podStartE2EDuration="2.089122553s" podCreationTimestamp="2026-03-09 09:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:55:55.087859918 +0000 UTC m=+2158.647787738" watchObservedRunningTime="2026-03-09 09:55:55.089122553 +0000 UTC m=+2158.649050363" Mar 09 09:55:56 crc kubenswrapper[4971]: I0309 09:55:56.078876 4971 generic.go:334] "Generic (PLEG): container finished" podID="7a8f0858-32a6-478b-9c8b-afbf58114bc4" containerID="addf5a50f8ae5778125e8c51fedd74e68fad53dc157ea2cd71b543cc15a49308" exitCode=0 Mar 09 09:55:56 crc kubenswrapper[4971]: I0309 09:55:56.078972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" 
event={"ID":"7a8f0858-32a6-478b-9c8b-afbf58114bc4","Type":"ContainerDied","Data":"addf5a50f8ae5778125e8c51fedd74e68fad53dc157ea2cd71b543cc15a49308"} Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.431164 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.480208 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw"] Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.486199 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw"] Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587722 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587786 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587810 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w66tn\" (UniqueName: 
\"kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587844 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.587921 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf\") pod \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\" (UID: \"7a8f0858-32a6-478b-9c8b-afbf58114bc4\") " Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.588409 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.588887 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.594779 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn" (OuterVolumeSpecName: "kube-api-access-w66tn") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "kube-api-access-w66tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.611610 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.614462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.623959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts" (OuterVolumeSpecName: "scripts") pod "7a8f0858-32a6-478b-9c8b-afbf58114bc4" (UID: "7a8f0858-32a6-478b-9c8b-afbf58114bc4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689674 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689716 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7a8f0858-32a6-478b-9c8b-afbf58114bc4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689729 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7a8f0858-32a6-478b-9c8b-afbf58114bc4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689742 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w66tn\" (UniqueName: \"kubernetes.io/projected/7a8f0858-32a6-478b-9c8b-afbf58114bc4-kube-api-access-w66tn\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689755 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:57 crc kubenswrapper[4971]: I0309 09:55:57.689771 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7a8f0858-32a6-478b-9c8b-afbf58114bc4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.097397 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90e12fddd109684672b4242e760ad8284b0143455298be55f759fd141c2833b7" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.097461 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zbxhw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.631151 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw"] Mar 09 09:55:58 crc kubenswrapper[4971]: E0309 09:55:58.631538 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8f0858-32a6-478b-9c8b-afbf58114bc4" containerName="swift-ring-rebalance" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.631557 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8f0858-32a6-478b-9c8b-afbf58114bc4" containerName="swift-ring-rebalance" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.631748 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8f0858-32a6-478b-9c8b-afbf58114bc4" containerName="swift-ring-rebalance" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.632375 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.634390 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.635850 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.645190 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw"] Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.701832 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.701895 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvrtv\" (UniqueName: \"kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.701944 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.701970 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.702013 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.702072 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.803211 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.803527 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.803675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.803828 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.803974 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.804083 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvrtv\" (UniqueName: \"kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.804325 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.804856 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.804961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.810981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.810998 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.821330 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvrtv\" (UniqueName: \"kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv\") pod \"swift-ring-rebalance-debug-wd2hw\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:58 crc kubenswrapper[4971]: I0309 09:55:58.959193 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:55:59 crc kubenswrapper[4971]: I0309 09:55:59.162981 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8f0858-32a6-478b-9c8b-afbf58114bc4" path="/var/lib/kubelet/pods/7a8f0858-32a6-478b-9c8b-afbf58114bc4/volumes" Mar 09 09:55:59 crc kubenswrapper[4971]: I0309 09:55:59.368690 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw"] Mar 09 09:55:59 crc kubenswrapper[4971]: W0309 09:55:59.374419 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92d7ad8f_1721_4e5c_9572_98bc7d68e68a.slice/crio-745fa02f7895962537afa339ea2612d90250d8e423a31a9af7a19784a5497811 WatchSource:0}: Error finding container 745fa02f7895962537afa339ea2612d90250d8e423a31a9af7a19784a5497811: Status 404 returned error can't find the container with id 745fa02f7895962537afa339ea2612d90250d8e423a31a9af7a19784a5497811 Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.121766 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" event={"ID":"92d7ad8f-1721-4e5c-9572-98bc7d68e68a","Type":"ContainerStarted","Data":"84e8b754bf78e7b154bcfed1e3556616bfbc248bc2f47dc0613358088fa52cfe"} Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.122133 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" event={"ID":"92d7ad8f-1721-4e5c-9572-98bc7d68e68a","Type":"ContainerStarted","Data":"745fa02f7895962537afa339ea2612d90250d8e423a31a9af7a19784a5497811"} Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.144847 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550836-4gnqj"] Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.146007 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.150470 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.150749 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.151170 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.151840 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" podStartSLOduration=2.151823142 podStartE2EDuration="2.151823142s" podCreationTimestamp="2026-03-09 09:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:00.137132238 +0000 UTC m=+2163.697060048" watchObservedRunningTime="2026-03-09 09:56:00.151823142 +0000 UTC m=+2163.711750952" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.163074 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-4gnqj"] Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.323856 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l9nr\" (UniqueName: \"kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr\") pod \"auto-csr-approver-29550836-4gnqj\" (UID: \"a220b921-116d-4764-86f8-894f497476f6\") " pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.426289 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l9nr\" (UniqueName: 
\"kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr\") pod \"auto-csr-approver-29550836-4gnqj\" (UID: \"a220b921-116d-4764-86f8-894f497476f6\") " pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.447472 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l9nr\" (UniqueName: \"kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr\") pod \"auto-csr-approver-29550836-4gnqj\" (UID: \"a220b921-116d-4764-86f8-894f497476f6\") " pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.476582 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.895802 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-4gnqj"] Mar 09 09:56:00 crc kubenswrapper[4971]: W0309 09:56:00.904838 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda220b921_116d_4764_86f8_894f497476f6.slice/crio-f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6 WatchSource:0}: Error finding container f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6: Status 404 returned error can't find the container with id f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6 Mar 09 09:56:00 crc kubenswrapper[4971]: I0309 09:56:00.910862 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:56:01 crc kubenswrapper[4971]: I0309 09:56:01.132481 4971 generic.go:334] "Generic (PLEG): container finished" podID="92d7ad8f-1721-4e5c-9572-98bc7d68e68a" containerID="84e8b754bf78e7b154bcfed1e3556616bfbc248bc2f47dc0613358088fa52cfe" exitCode=0 Mar 09 09:56:01 crc 
kubenswrapper[4971]: I0309 09:56:01.132552 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" event={"ID":"92d7ad8f-1721-4e5c-9572-98bc7d68e68a","Type":"ContainerDied","Data":"84e8b754bf78e7b154bcfed1e3556616bfbc248bc2f47dc0613358088fa52cfe"} Mar 09 09:56:01 crc kubenswrapper[4971]: I0309 09:56:01.134712 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" event={"ID":"a220b921-116d-4764-86f8-894f497476f6","Type":"ContainerStarted","Data":"f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6"} Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.144593 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" event={"ID":"a220b921-116d-4764-86f8-894f497476f6","Type":"ContainerStarted","Data":"0540622c2bcb9bcf74f1cff2a4d07ae304122babc85ef78c6a76be2b57b0ffa7"} Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.165002 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" podStartSLOduration=1.288732268 podStartE2EDuration="2.164985957s" podCreationTimestamp="2026-03-09 09:56:00 +0000 UTC" firstStartedPulling="2026-03-09 09:56:00.91059695 +0000 UTC m=+2164.470524760" lastFinishedPulling="2026-03-09 09:56:01.786850629 +0000 UTC m=+2165.346778449" observedRunningTime="2026-03-09 09:56:02.158316809 +0000 UTC m=+2165.718244699" watchObservedRunningTime="2026-03-09 09:56:02.164985957 +0000 UTC m=+2165.724913767" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.417082 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.448399 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw"] Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.456476 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw"] Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.567831 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.567917 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.567951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568013 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568030 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568057 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvrtv\" (UniqueName: \"kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv\") pod \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\" (UID: \"92d7ad8f-1721-4e5c-9572-98bc7d68e68a\") " Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568772 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568917 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.568935 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.612547 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv" (OuterVolumeSpecName: "kube-api-access-hvrtv") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "kube-api-access-hvrtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.612707 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.647549 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.650111 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts" (OuterVolumeSpecName: "scripts") pod "92d7ad8f-1721-4e5c-9572-98bc7d68e68a" (UID: "92d7ad8f-1721-4e5c-9572-98bc7d68e68a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.670314 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.670380 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.670396 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvrtv\" (UniqueName: \"kubernetes.io/projected/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-kube-api-access-hvrtv\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:02 crc kubenswrapper[4971]: I0309 09:56:02.670413 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/92d7ad8f-1721-4e5c-9572-98bc7d68e68a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.153538 4971 generic.go:334] "Generic (PLEG): container finished" podID="a220b921-116d-4764-86f8-894f497476f6" containerID="0540622c2bcb9bcf74f1cff2a4d07ae304122babc85ef78c6a76be2b57b0ffa7" exitCode=0 Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.156160 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wd2hw" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.161246 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d7ad8f-1721-4e5c-9572-98bc7d68e68a" path="/var/lib/kubelet/pods/92d7ad8f-1721-4e5c-9572-98bc7d68e68a/volumes" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.163732 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" event={"ID":"a220b921-116d-4764-86f8-894f497476f6","Type":"ContainerDied","Data":"0540622c2bcb9bcf74f1cff2a4d07ae304122babc85ef78c6a76be2b57b0ffa7"} Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.163779 4971 scope.go:117] "RemoveContainer" containerID="84e8b754bf78e7b154bcfed1e3556616bfbc248bc2f47dc0613358088fa52cfe" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.575620 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"] Mar 09 09:56:03 crc kubenswrapper[4971]: E0309 09:56:03.575930 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d7ad8f-1721-4e5c-9572-98bc7d68e68a" containerName="swift-ring-rebalance" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.575949 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d7ad8f-1721-4e5c-9572-98bc7d68e68a" containerName="swift-ring-rebalance" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.576107 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d7ad8f-1721-4e5c-9572-98bc7d68e68a" containerName="swift-ring-rebalance" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.576579 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.578594 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.578749 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582093 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582125 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg26x\" (UniqueName: \"kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x\") pod 
\"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.582231 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.585265 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"] Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.683576 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg26x\" (UniqueName: \"kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.683763 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc 
kubenswrapper[4971]: I0309 09:56:03.683820 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.683851 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.683878 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.683904 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.684580 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc 
kubenswrapper[4971]: I0309 09:56:03.684711 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.684956 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.689911 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.694526 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: I0309 09:56:03.701193 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg26x\" (UniqueName: \"kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x\") pod \"swift-ring-rebalance-debug-x8lpf\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:03 crc kubenswrapper[4971]: 
I0309 09:56:03.902439 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:04 crc kubenswrapper[4971]: I0309 09:56:04.327740 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"] Mar 09 09:56:04 crc kubenswrapper[4971]: W0309 09:56:04.330563 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff80ca5e_0492_400d_9b49_6fbb4576c918.slice/crio-95179886efe308d6380a65a54af516f4d4dd638ee0df9067235ad956bfdbc566 WatchSource:0}: Error finding container 95179886efe308d6380a65a54af516f4d4dd638ee0df9067235ad956bfdbc566: Status 404 returned error can't find the container with id 95179886efe308d6380a65a54af516f4d4dd638ee0df9067235ad956bfdbc566 Mar 09 09:56:04 crc kubenswrapper[4971]: I0309 09:56:04.453204 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:04 crc kubenswrapper[4971]: I0309 09:56:04.597066 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l9nr\" (UniqueName: \"kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr\") pod \"a220b921-116d-4764-86f8-894f497476f6\" (UID: \"a220b921-116d-4764-86f8-894f497476f6\") " Mar 09 09:56:04 crc kubenswrapper[4971]: I0309 09:56:04.601544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr" (OuterVolumeSpecName: "kube-api-access-4l9nr") pod "a220b921-116d-4764-86f8-894f497476f6" (UID: "a220b921-116d-4764-86f8-894f497476f6"). InnerVolumeSpecName "kube-api-access-4l9nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:04 crc kubenswrapper[4971]: I0309 09:56:04.698550 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l9nr\" (UniqueName: \"kubernetes.io/projected/a220b921-116d-4764-86f8-894f497476f6-kube-api-access-4l9nr\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.178613 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" event={"ID":"a220b921-116d-4764-86f8-894f497476f6","Type":"ContainerDied","Data":"f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6"} Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.178892 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d30349f5a7a30b5a0b0a58b8c9774ecdc6fd9b2e286a3cb27d51541bedc0e6" Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.178966 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-4gnqj" Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.180800 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" event={"ID":"ff80ca5e-0492-400d-9b49-6fbb4576c918","Type":"ContainerStarted","Data":"b7359a996de529e746fd0c2682b33c6703ef9ce3b8fdfc095db7ac9f4498e328"} Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.180842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" event={"ID":"ff80ca5e-0492-400d-9b49-6fbb4576c918","Type":"ContainerStarted","Data":"95179886efe308d6380a65a54af516f4d4dd638ee0df9067235ad956bfdbc566"} Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.215500 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" podStartSLOduration=2.21547639 podStartE2EDuration="2.21547639s" 
podCreationTimestamp="2026-03-09 09:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:05.205712975 +0000 UTC m=+2168.765640785" watchObservedRunningTime="2026-03-09 09:56:05.21547639 +0000 UTC m=+2168.775404210" Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.234191 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-t6nld"] Mar 09 09:56:05 crc kubenswrapper[4971]: I0309 09:56:05.242372 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-t6nld"] Mar 09 09:56:06 crc kubenswrapper[4971]: I0309 09:56:06.193114 4971 generic.go:334] "Generic (PLEG): container finished" podID="ff80ca5e-0492-400d-9b49-6fbb4576c918" containerID="b7359a996de529e746fd0c2682b33c6703ef9ce3b8fdfc095db7ac9f4498e328" exitCode=0 Mar 09 09:56:06 crc kubenswrapper[4971]: I0309 09:56:06.193210 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" event={"ID":"ff80ca5e-0492-400d-9b49-6fbb4576c918","Type":"ContainerDied","Data":"b7359a996de529e746fd0c2682b33c6703ef9ce3b8fdfc095db7ac9f4498e328"} Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.163657 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0bfbf3-81ff-4403-98da-d03b5baa13ae" path="/var/lib/kubelet/pods/fd0bfbf3-81ff-4403-98da-d03b5baa13ae/volumes" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.490264 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.531190 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"] Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.540883 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"] Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.642787 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.643185 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.643226 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.643295 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.643367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.643426 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg26x\" (UniqueName: \"kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x\") pod \"ff80ca5e-0492-400d-9b49-6fbb4576c918\" (UID: \"ff80ca5e-0492-400d-9b49-6fbb4576c918\") " Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.645425 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.646147 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.649645 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x" (OuterVolumeSpecName: "kube-api-access-fg26x") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "kube-api-access-fg26x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.665307 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts" (OuterVolumeSpecName: "scripts") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.669213 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.671676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ff80ca5e-0492-400d-9b49-6fbb4576c918" (UID: "ff80ca5e-0492-400d-9b49-6fbb4576c918"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.745661 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.745890 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff80ca5e-0492-400d-9b49-6fbb4576c918-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.745981 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg26x\" (UniqueName: \"kubernetes.io/projected/ff80ca5e-0492-400d-9b49-6fbb4576c918-kube-api-access-fg26x\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.746049 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.746108 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff80ca5e-0492-400d-9b49-6fbb4576c918-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:07 crc kubenswrapper[4971]: I0309 09:56:07.746192 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff80ca5e-0492-400d-9b49-6fbb4576c918-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.209758 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95179886efe308d6380a65a54af516f4d4dd638ee0df9067235ad956bfdbc566"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.209807 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-x8lpf"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.655731 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"]
Mar 09 09:56:08 crc kubenswrapper[4971]: E0309 09:56:08.656066 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a220b921-116d-4764-86f8-894f497476f6" containerName="oc"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.656082 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a220b921-116d-4764-86f8-894f497476f6" containerName="oc"
Mar 09 09:56:08 crc kubenswrapper[4971]: E0309 09:56:08.656105 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff80ca5e-0492-400d-9b49-6fbb4576c918" containerName="swift-ring-rebalance"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.656111 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff80ca5e-0492-400d-9b49-6fbb4576c918" containerName="swift-ring-rebalance"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.656259 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff80ca5e-0492-400d-9b49-6fbb4576c918" containerName="swift-ring-rebalance"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.656280 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a220b921-116d-4764-86f8-894f497476f6" containerName="oc"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.656751 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.658544 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.659201 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.669728 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"]
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.761907 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.762280 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.762315 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.762342 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.762424 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.762466 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrgd6\" (UniqueName: \"kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863518 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863628 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863670 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863699 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863738 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.863782 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrgd6\" (UniqueName: \"kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.864994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.865034 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.865460 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.868048 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.868325 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.880740 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrgd6\" (UniqueName: \"kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6\") pod \"swift-ring-rebalance-debug-rxvk4\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:08 crc kubenswrapper[4971]: I0309 09:56:08.971452 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:09 crc kubenswrapper[4971]: I0309 09:56:09.160997 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff80ca5e-0492-400d-9b49-6fbb4576c918" path="/var/lib/kubelet/pods/ff80ca5e-0492-400d-9b49-6fbb4576c918/volumes"
Mar 09 09:56:09 crc kubenswrapper[4971]: I0309 09:56:09.376684 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"]
Mar 09 09:56:09 crc kubenswrapper[4971]: W0309 09:56:09.382034 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2a8946_79bf_468c_8f59_58dfa6bde85d.slice/crio-bedbe18847a6cdc729fbcbdf011a7b254e9b9ed4f80777e074d88f39680cbe0b WatchSource:0}: Error finding container bedbe18847a6cdc729fbcbdf011a7b254e9b9ed4f80777e074d88f39680cbe0b: Status 404 returned error can't find the container with id bedbe18847a6cdc729fbcbdf011a7b254e9b9ed4f80777e074d88f39680cbe0b
Mar 09 09:56:10 crc kubenswrapper[4971]: I0309 09:56:10.227723 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4" event={"ID":"0a2a8946-79bf-468c-8f59-58dfa6bde85d","Type":"ContainerStarted","Data":"3ddfd428a892dd228505d8a3d4a81745e6d6d5f9552344c96768dd4602f29fe0"}
Mar 09 09:56:10 crc kubenswrapper[4971]: I0309 09:56:10.228189 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4" event={"ID":"0a2a8946-79bf-468c-8f59-58dfa6bde85d","Type":"ContainerStarted","Data":"bedbe18847a6cdc729fbcbdf011a7b254e9b9ed4f80777e074d88f39680cbe0b"}
Mar 09 09:56:10 crc kubenswrapper[4971]: I0309 09:56:10.246694 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4" podStartSLOduration=2.246673202 podStartE2EDuration="2.246673202s" podCreationTimestamp="2026-03-09 09:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:10.243366339 +0000 UTC m=+2173.803294179" watchObservedRunningTime="2026-03-09 09:56:10.246673202 +0000 UTC m=+2173.806601012"
Mar 09 09:56:11 crc kubenswrapper[4971]: I0309 09:56:11.238512 4971 generic.go:334] "Generic (PLEG): container finished" podID="0a2a8946-79bf-468c-8f59-58dfa6bde85d" containerID="3ddfd428a892dd228505d8a3d4a81745e6d6d5f9552344c96768dd4602f29fe0" exitCode=0
Mar 09 09:56:11 crc kubenswrapper[4971]: I0309 09:56:11.238563 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4" event={"ID":"0a2a8946-79bf-468c-8f59-58dfa6bde85d","Type":"ContainerDied","Data":"3ddfd428a892dd228505d8a3d4a81745e6d6d5f9552344c96768dd4602f29fe0"}
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.515582 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.546040 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"]
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.551165 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"]
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.619855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.619967 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.620039 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrgd6\" (UniqueName: \"kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.620092 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.620131 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.620170 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf\") pod \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\" (UID: \"0a2a8946-79bf-468c-8f59-58dfa6bde85d\") "
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.621066 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.622265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.627892 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6" (OuterVolumeSpecName: "kube-api-access-lrgd6") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "kube-api-access-lrgd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.644190 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts" (OuterVolumeSpecName: "scripts") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.644869 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.644986 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0a2a8946-79bf-468c-8f59-58dfa6bde85d" (UID: "0a2a8946-79bf-468c-8f59-58dfa6bde85d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723089 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723126 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723144 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a2a8946-79bf-468c-8f59-58dfa6bde85d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723185 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrgd6\" (UniqueName: \"kubernetes.io/projected/0a2a8946-79bf-468c-8f59-58dfa6bde85d-kube-api-access-lrgd6\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723200 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a2a8946-79bf-468c-8f59-58dfa6bde85d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:12 crc kubenswrapper[4971]: I0309 09:56:12.723209 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a2a8946-79bf-468c-8f59-58dfa6bde85d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.160727 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2a8946-79bf-468c-8f59-58dfa6bde85d" path="/var/lib/kubelet/pods/0a2a8946-79bf-468c-8f59-58dfa6bde85d/volumes"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.256853 4971 scope.go:117] "RemoveContainer" containerID="3ddfd428a892dd228505d8a3d4a81745e6d6d5f9552344c96768dd4602f29fe0"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.256873 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rxvk4"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.682209 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m95df"]
Mar 09 09:56:13 crc kubenswrapper[4971]: E0309 09:56:13.683010 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2a8946-79bf-468c-8f59-58dfa6bde85d" containerName="swift-ring-rebalance"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.683078 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2a8946-79bf-468c-8f59-58dfa6bde85d" containerName="swift-ring-rebalance"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.683456 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2a8946-79bf-468c-8f59-58dfa6bde85d" containerName="swift-ring-rebalance"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.685163 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.687796 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.688020 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.693135 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m95df"]
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.838739 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.838813 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.838854 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.838870 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.839366 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflnf\" (UniqueName: \"kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.839524 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.940648 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.940732 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.940833 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflnf\" (UniqueName: \"kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.940915 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.940963 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.941033 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.941109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.941845 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.942151 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.945297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.945584 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:13 crc kubenswrapper[4971]: I0309 09:56:13.958294 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflnf\" (UniqueName: \"kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf\") pod \"swift-ring-rebalance-debug-m95df\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:14 crc kubenswrapper[4971]: I0309 09:56:14.007143 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:14 crc kubenswrapper[4971]: I0309 09:56:14.418090 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m95df"]
Mar 09 09:56:14 crc kubenswrapper[4971]: W0309 09:56:14.421656 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e2735e6_b79a_4d65_837b_e39b57badd6a.slice/crio-46f1dcfaff06165844e2e34b3e6babbe9fa98a5d3c67a54f8ce797a67302938b WatchSource:0}: Error finding container 46f1dcfaff06165844e2e34b3e6babbe9fa98a5d3c67a54f8ce797a67302938b: Status 404 returned error can't find the container with id 46f1dcfaff06165844e2e34b3e6babbe9fa98a5d3c67a54f8ce797a67302938b
Mar 09 09:56:15 crc kubenswrapper[4971]: I0309 09:56:15.275340 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df" event={"ID":"1e2735e6-b79a-4d65-837b-e39b57badd6a","Type":"ContainerStarted","Data":"9209e709ecb9e93c272bf7c92edf9bfb58eb7683ec1e7a2aae415a62f09861f4"}
Mar 09 09:56:15 crc kubenswrapper[4971]: I0309 09:56:15.276828 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df" event={"ID":"1e2735e6-b79a-4d65-837b-e39b57badd6a","Type":"ContainerStarted","Data":"46f1dcfaff06165844e2e34b3e6babbe9fa98a5d3c67a54f8ce797a67302938b"}
Mar 09 09:56:15 crc kubenswrapper[4971]: I0309 09:56:15.295226 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df" podStartSLOduration=2.295206242 podStartE2EDuration="2.295206242s" podCreationTimestamp="2026-03-09 09:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:15.295093399 +0000 UTC m=+2178.855021219" watchObservedRunningTime="2026-03-09 09:56:15.295206242 +0000 UTC m=+2178.855134052"
Mar 09 09:56:16 crc kubenswrapper[4971]: I0309 09:56:16.286062 4971 generic.go:334] "Generic (PLEG): container finished" podID="1e2735e6-b79a-4d65-837b-e39b57badd6a" containerID="9209e709ecb9e93c272bf7c92edf9bfb58eb7683ec1e7a2aae415a62f09861f4" exitCode=0
Mar 09 09:56:16 crc kubenswrapper[4971]: I0309 09:56:16.286140 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df" event={"ID":"1e2735e6-b79a-4d65-837b-e39b57badd6a","Type":"ContainerDied","Data":"9209e709ecb9e93c272bf7c92edf9bfb58eb7683ec1e7a2aae415a62f09861f4"}
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.578496 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df"
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.607513 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m95df"]
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.614238 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m95df"]
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.692838 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.693607 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bflnf\" (UniqueName: \"kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.693774 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.693946 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.694071 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.694199 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift\") pod \"1e2735e6-b79a-4d65-837b-e39b57badd6a\" (UID: \"1e2735e6-b79a-4d65-837b-e39b57badd6a\") "
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.693515 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.694724 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.694975 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.699189 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf" (OuterVolumeSpecName: "kube-api-access-bflnf") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "kube-api-access-bflnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.743283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts" (OuterVolumeSpecName: "scripts") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.750475 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.759949 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1e2735e6-b79a-4d65-837b-e39b57badd6a" (UID: "1e2735e6-b79a-4d65-837b-e39b57badd6a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.796388 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.796429 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e2735e6-b79a-4d65-837b-e39b57badd6a-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.796440 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1e2735e6-b79a-4d65-837b-e39b57badd6a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.796451 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bflnf\" (UniqueName: \"kubernetes.io/projected/1e2735e6-b79a-4d65-837b-e39b57badd6a-kube-api-access-bflnf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:17 crc kubenswrapper[4971]: I0309 09:56:17.796461 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1e2735e6-b79a-4d65-837b-e39b57badd6a-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.303018 4971 pod_container_deletor.go:80] "Container not found in pod's containers"
containerID="46f1dcfaff06165844e2e34b3e6babbe9fa98a5d3c67a54f8ce797a67302938b" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.303085 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m95df" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.758605 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9"] Mar 09 09:56:18 crc kubenswrapper[4971]: E0309 09:56:18.758949 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2735e6-b79a-4d65-837b-e39b57badd6a" containerName="swift-ring-rebalance" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.758966 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2735e6-b79a-4d65-837b-e39b57badd6a" containerName="swift-ring-rebalance" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.759106 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2735e6-b79a-4d65-837b-e39b57badd6a" containerName="swift-ring-rebalance" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.759633 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.762873 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.762942 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.771033 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9"] Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldxj\" (UniqueName: \"kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915190 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915220 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915240 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915282 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:18 crc kubenswrapper[4971]: I0309 09:56:18.915333 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.017141 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldxj\" (UniqueName: \"kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.017222 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 
crc kubenswrapper[4971]: I0309 09:56:19.017253 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.017305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.017506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.017537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.018310 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc 
kubenswrapper[4971]: I0309 09:56:19.018608 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.018811 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.023555 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.025825 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.032637 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldxj\" (UniqueName: \"kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj\") pod \"swift-ring-rebalance-debug-hxpx9\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: 
I0309 09:56:19.078399 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.164811 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2735e6-b79a-4d65-837b-e39b57badd6a" path="/var/lib/kubelet/pods/1e2735e6-b79a-4d65-837b-e39b57badd6a/volumes" Mar 09 09:56:19 crc kubenswrapper[4971]: I0309 09:56:19.483962 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9"] Mar 09 09:56:19 crc kubenswrapper[4971]: W0309 09:56:19.484978 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6edb8d94_5e6e_433e_bfaf_a94dfae399b2.slice/crio-5606b93ce34dd88b36ebb6f67281e9362f562482beb47c9ca8dd9815054d2ac5 WatchSource:0}: Error finding container 5606b93ce34dd88b36ebb6f67281e9362f562482beb47c9ca8dd9815054d2ac5: Status 404 returned error can't find the container with id 5606b93ce34dd88b36ebb6f67281e9362f562482beb47c9ca8dd9815054d2ac5 Mar 09 09:56:20 crc kubenswrapper[4971]: I0309 09:56:20.320375 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" event={"ID":"6edb8d94-5e6e-433e-bfaf-a94dfae399b2","Type":"ContainerStarted","Data":"42e367bf36498ccf71c1f045553cc090e126056bb16d1fdd40410525ccb66959"} Mar 09 09:56:20 crc kubenswrapper[4971]: I0309 09:56:20.320954 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" event={"ID":"6edb8d94-5e6e-433e-bfaf-a94dfae399b2","Type":"ContainerStarted","Data":"5606b93ce34dd88b36ebb6f67281e9362f562482beb47c9ca8dd9815054d2ac5"} Mar 09 09:56:20 crc kubenswrapper[4971]: I0309 09:56:20.344765 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" 
podStartSLOduration=2.344749691 podStartE2EDuration="2.344749691s" podCreationTimestamp="2026-03-09 09:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:20.339191084 +0000 UTC m=+2183.899118894" watchObservedRunningTime="2026-03-09 09:56:20.344749691 +0000 UTC m=+2183.904677501" Mar 09 09:56:21 crc kubenswrapper[4971]: I0309 09:56:21.330463 4971 generic.go:334] "Generic (PLEG): container finished" podID="6edb8d94-5e6e-433e-bfaf-a94dfae399b2" containerID="42e367bf36498ccf71c1f045553cc090e126056bb16d1fdd40410525ccb66959" exitCode=0 Mar 09 09:56:21 crc kubenswrapper[4971]: I0309 09:56:21.330513 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" event={"ID":"6edb8d94-5e6e-433e-bfaf-a94dfae399b2","Type":"ContainerDied","Data":"42e367bf36498ccf71c1f045553cc090e126056bb16d1fdd40410525ccb66959"} Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.604307 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.637907 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9"] Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.644942 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9"] Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773012 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773100 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773261 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773363 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773418 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ldxj\" (UniqueName: \"kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj\") pod \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\" (UID: \"6edb8d94-5e6e-433e-bfaf-a94dfae399b2\") " Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773773 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.773950 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.774385 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.774409 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.778259 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj" (OuterVolumeSpecName: "kube-api-access-7ldxj") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "kube-api-access-7ldxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.794745 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.797285 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts" (OuterVolumeSpecName: "scripts") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.797588 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6edb8d94-5e6e-433e-bfaf-a94dfae399b2" (UID: "6edb8d94-5e6e-433e-bfaf-a94dfae399b2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.875804 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.875837 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.875850 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:22 crc kubenswrapper[4971]: I0309 09:56:22.875864 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ldxj\" (UniqueName: \"kubernetes.io/projected/6edb8d94-5e6e-433e-bfaf-a94dfae399b2-kube-api-access-7ldxj\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.162193 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edb8d94-5e6e-433e-bfaf-a94dfae399b2" path="/var/lib/kubelet/pods/6edb8d94-5e6e-433e-bfaf-a94dfae399b2/volumes" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.356678 4971 scope.go:117] "RemoveContainer" containerID="42e367bf36498ccf71c1f045553cc090e126056bb16d1fdd40410525ccb66959" Mar 09 09:56:23 crc 
kubenswrapper[4971]: I0309 09:56:23.356805 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxpx9" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.768651 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg"] Mar 09 09:56:23 crc kubenswrapper[4971]: E0309 09:56:23.769311 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edb8d94-5e6e-433e-bfaf-a94dfae399b2" containerName="swift-ring-rebalance" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.769326 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edb8d94-5e6e-433e-bfaf-a94dfae399b2" containerName="swift-ring-rebalance" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.769514 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6edb8d94-5e6e-433e-bfaf-a94dfae399b2" containerName="swift-ring-rebalance" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.770063 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.772779 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.772842 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.779623 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg"] Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.888463 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.888716 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.888785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkq9\" (UniqueName: \"kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.888940 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.888999 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.889059 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.989864 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.989949 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkq9\" (UniqueName: \"kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc 
kubenswrapper[4971]: I0309 09:56:23.989984 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.990019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.990045 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.990066 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.990696 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc 
kubenswrapper[4971]: I0309 09:56:23.991253 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.991339 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.995947 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:23 crc kubenswrapper[4971]: I0309 09:56:23.996643 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:24 crc kubenswrapper[4971]: I0309 09:56:24.008997 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkq9\" (UniqueName: \"kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9\") pod \"swift-ring-rebalance-debug-xt7lg\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:24 crc kubenswrapper[4971]: 
I0309 09:56:24.090949 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:24 crc kubenswrapper[4971]: I0309 09:56:24.508501 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg"] Mar 09 09:56:24 crc kubenswrapper[4971]: W0309 09:56:24.513475 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fbfd7e_856d_4570_9951_d660130473c7.slice/crio-9c1fcb5797a5c99253a14ede152b33b839aedd2c90ce8d88aeb00098c9446300 WatchSource:0}: Error finding container 9c1fcb5797a5c99253a14ede152b33b839aedd2c90ce8d88aeb00098c9446300: Status 404 returned error can't find the container with id 9c1fcb5797a5c99253a14ede152b33b839aedd2c90ce8d88aeb00098c9446300 Mar 09 09:56:25 crc kubenswrapper[4971]: I0309 09:56:25.375909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" event={"ID":"32fbfd7e-856d-4570-9951-d660130473c7","Type":"ContainerStarted","Data":"9c7e0e86b8223b66eeeee7e68d93bfc096cc6e5e37d0a55d9172ea7846991416"} Mar 09 09:56:25 crc kubenswrapper[4971]: I0309 09:56:25.376254 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" event={"ID":"32fbfd7e-856d-4570-9951-d660130473c7","Type":"ContainerStarted","Data":"9c1fcb5797a5c99253a14ede152b33b839aedd2c90ce8d88aeb00098c9446300"} Mar 09 09:56:25 crc kubenswrapper[4971]: I0309 09:56:25.395190 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" podStartSLOduration=2.395084151 podStartE2EDuration="2.395084151s" podCreationTimestamp="2026-03-09 09:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:25.390104311 +0000 
UTC m=+2188.950032121" watchObservedRunningTime="2026-03-09 09:56:25.395084151 +0000 UTC m=+2188.955011961" Mar 09 09:56:26 crc kubenswrapper[4971]: I0309 09:56:26.388306 4971 generic.go:334] "Generic (PLEG): container finished" podID="32fbfd7e-856d-4570-9951-d660130473c7" containerID="9c7e0e86b8223b66eeeee7e68d93bfc096cc6e5e37d0a55d9172ea7846991416" exitCode=0 Mar 09 09:56:26 crc kubenswrapper[4971]: I0309 09:56:26.388446 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" event={"ID":"32fbfd7e-856d-4570-9951-d660130473c7","Type":"ContainerDied","Data":"9c7e0e86b8223b66eeeee7e68d93bfc096cc6e5e37d0a55d9172ea7846991416"} Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.695100 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.722848 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg"] Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.728549 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg"] Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.848444 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.849002 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.849057 
4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kkq9\" (UniqueName: \"kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.849097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.849141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.849258 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices\") pod \"32fbfd7e-856d-4570-9951-d660130473c7\" (UID: \"32fbfd7e-856d-4570-9951-d660130473c7\") " Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.850045 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.850463 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.863553 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9" (OuterVolumeSpecName: "kube-api-access-4kkq9") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "kube-api-access-4kkq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.872685 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.874253 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts" (OuterVolumeSpecName: "scripts") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.877499 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "32fbfd7e-856d-4570-9951-d660130473c7" (UID: "32fbfd7e-856d-4570-9951-d660130473c7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950641 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950692 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950701 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32fbfd7e-856d-4570-9951-d660130473c7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950711 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kkq9\" (UniqueName: \"kubernetes.io/projected/32fbfd7e-856d-4570-9951-d660130473c7-kube-api-access-4kkq9\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950722 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32fbfd7e-856d-4570-9951-d660130473c7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:27 crc kubenswrapper[4971]: I0309 09:56:27.950731 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/32fbfd7e-856d-4570-9951-d660130473c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.407381 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1fcb5797a5c99253a14ede152b33b839aedd2c90ce8d88aeb00098c9446300" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.407466 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt7lg" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.868927 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk"] Mar 09 09:56:28 crc kubenswrapper[4971]: E0309 09:56:28.869338 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fbfd7e-856d-4570-9951-d660130473c7" containerName="swift-ring-rebalance" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.869414 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fbfd7e-856d-4570-9951-d660130473c7" containerName="swift-ring-rebalance" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.869636 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fbfd7e-856d-4570-9951-d660130473c7" containerName="swift-ring-rebalance" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.870464 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.872579 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.874301 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:28 crc kubenswrapper[4971]: I0309 09:56:28.879018 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk"] Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069068 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069470 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069520 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069548 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069646 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.069676 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vbb\" (UniqueName: \"kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.161782 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fbfd7e-856d-4570-9951-d660130473c7" path="/var/lib/kubelet/pods/32fbfd7e-856d-4570-9951-d660130473c7/volumes" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171244 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171392 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vbb\" (UniqueName: \"kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171492 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.171736 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.172368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.172368 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.179067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.179167 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.192075 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vbb\" (UniqueName: 
\"kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb\") pod \"swift-ring-rebalance-debug-gp9jk\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.200111 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:29 crc kubenswrapper[4971]: I0309 09:56:29.633993 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk"] Mar 09 09:56:30 crc kubenswrapper[4971]: I0309 09:56:30.439389 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" event={"ID":"f6e0f71c-dc31-470d-8508-520cd7f4e8ec","Type":"ContainerStarted","Data":"741c0fc56124ea1ce9ecf2149c689fa71bd88630fb6b6114a74818295635a8f0"} Mar 09 09:56:30 crc kubenswrapper[4971]: I0309 09:56:30.439812 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" event={"ID":"f6e0f71c-dc31-470d-8508-520cd7f4e8ec","Type":"ContainerStarted","Data":"bfc52fe05d95b635f9fd8219595fe750a8c987dc944d6edda60733c4cbf3fcd9"} Mar 09 09:56:30 crc kubenswrapper[4971]: I0309 09:56:30.460663 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" podStartSLOduration=2.460646523 podStartE2EDuration="2.460646523s" podCreationTimestamp="2026-03-09 09:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:30.458604515 +0000 UTC m=+2194.018532345" watchObservedRunningTime="2026-03-09 09:56:30.460646523 +0000 UTC m=+2194.020574333" Mar 09 09:56:31 crc kubenswrapper[4971]: I0309 09:56:31.449234 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="f6e0f71c-dc31-470d-8508-520cd7f4e8ec" containerID="741c0fc56124ea1ce9ecf2149c689fa71bd88630fb6b6114a74818295635a8f0" exitCode=0 Mar 09 09:56:31 crc kubenswrapper[4971]: I0309 09:56:31.449278 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" event={"ID":"f6e0f71c-dc31-470d-8508-520cd7f4e8ec","Type":"ContainerDied","Data":"741c0fc56124ea1ce9ecf2149c689fa71bd88630fb6b6114a74818295635a8f0"} Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.741481 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.775084 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk"] Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.790554 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk"] Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924449 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices\") pod \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924569 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf\") pod \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924662 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift\") pod 
\"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924686 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5vbb\" (UniqueName: \"kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb\") pod \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts\") pod \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.924747 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf\") pod \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\" (UID: \"f6e0f71c-dc31-470d-8508-520cd7f4e8ec\") " Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.925402 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.925819 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.926007 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.926042 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.930740 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb" (OuterVolumeSpecName: "kube-api-access-t5vbb") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "kube-api-access-t5vbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.947288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.949333 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:32 crc kubenswrapper[4971]: I0309 09:56:32.964225 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts" (OuterVolumeSpecName: "scripts") pod "f6e0f71c-dc31-470d-8508-520cd7f4e8ec" (UID: "f6e0f71c-dc31-470d-8508-520cd7f4e8ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.027110 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.027145 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5vbb\" (UniqueName: \"kubernetes.io/projected/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-kube-api-access-t5vbb\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.027157 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.027168 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6e0f71c-dc31-470d-8508-520cd7f4e8ec-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.165744 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e0f71c-dc31-470d-8508-520cd7f4e8ec" path="/var/lib/kubelet/pods/f6e0f71c-dc31-470d-8508-520cd7f4e8ec/volumes" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.468178 4971 scope.go:117] "RemoveContainer" containerID="741c0fc56124ea1ce9ecf2149c689fa71bd88630fb6b6114a74818295635a8f0" Mar 09 09:56:33 crc kubenswrapper[4971]: 
I0309 09:56:33.468219 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gp9jk" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.910151 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6"] Mar 09 09:56:33 crc kubenswrapper[4971]: E0309 09:56:33.910547 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e0f71c-dc31-470d-8508-520cd7f4e8ec" containerName="swift-ring-rebalance" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.910564 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e0f71c-dc31-470d-8508-520cd7f4e8ec" containerName="swift-ring-rebalance" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.910743 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e0f71c-dc31-470d-8508-520cd7f4e8ec" containerName="swift-ring-rebalance" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.911326 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.913988 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.917115 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.945835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.945940 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.946016 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjxg\" (UniqueName: \"kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.946040 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts\") pod 
\"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.946064 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.946113 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:33 crc kubenswrapper[4971]: I0309 09:56:33.948070 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6"] Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.046970 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.047025 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: 
I0309 09:56:34.047065 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjxg\" (UniqueName: \"kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.047084 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.047099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.047118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.047532 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: 
I0309 09:56:34.048109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.049603 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.050813 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.051678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.063525 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjxg\" (UniqueName: \"kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg\") pod \"swift-ring-rebalance-debug-bvbq6\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.253596 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:34 crc kubenswrapper[4971]: I0309 09:56:34.494320 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6"] Mar 09 09:56:34 crc kubenswrapper[4971]: W0309 09:56:34.499328 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c225d27_e34e_407c_a339_cf6016a13118.slice/crio-2b4d37b9601fda15af9b9c617af1c6178e98278dec4d0aa2eb58a594e083d0fc WatchSource:0}: Error finding container 2b4d37b9601fda15af9b9c617af1c6178e98278dec4d0aa2eb58a594e083d0fc: Status 404 returned error can't find the container with id 2b4d37b9601fda15af9b9c617af1c6178e98278dec4d0aa2eb58a594e083d0fc Mar 09 09:56:35 crc kubenswrapper[4971]: I0309 09:56:35.506206 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" event={"ID":"6c225d27-e34e-407c-a339-cf6016a13118","Type":"ContainerStarted","Data":"9610e8eb00338ed9eaafbdffba515c31d540923887aef38c53f3a1c5b79eb694"} Mar 09 09:56:35 crc kubenswrapper[4971]: I0309 09:56:35.506624 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" event={"ID":"6c225d27-e34e-407c-a339-cf6016a13118","Type":"ContainerStarted","Data":"2b4d37b9601fda15af9b9c617af1c6178e98278dec4d0aa2eb58a594e083d0fc"} Mar 09 09:56:35 crc kubenswrapper[4971]: I0309 09:56:35.535299 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" podStartSLOduration=2.535280309 podStartE2EDuration="2.535280309s" podCreationTimestamp="2026-03-09 09:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:35.529829805 +0000 UTC 
m=+2199.089757625" watchObservedRunningTime="2026-03-09 09:56:35.535280309 +0000 UTC m=+2199.095208129" Mar 09 09:56:36 crc kubenswrapper[4971]: I0309 09:56:36.519937 4971 generic.go:334] "Generic (PLEG): container finished" podID="6c225d27-e34e-407c-a339-cf6016a13118" containerID="9610e8eb00338ed9eaafbdffba515c31d540923887aef38c53f3a1c5b79eb694" exitCode=0 Mar 09 09:56:36 crc kubenswrapper[4971]: I0309 09:56:36.520017 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" event={"ID":"6c225d27-e34e-407c-a339-cf6016a13118","Type":"ContainerDied","Data":"9610e8eb00338ed9eaafbdffba515c31d540923887aef38c53f3a1c5b79eb694"} Mar 09 09:56:37 crc kubenswrapper[4971]: I0309 09:56:37.898112 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:37 crc kubenswrapper[4971]: I0309 09:56:37.929330 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6"] Mar 09 09:56:37 crc kubenswrapper[4971]: I0309 09:56:37.937150 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6"] Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017132 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017204 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 
09:56:38.017276 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017310 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxjxg\" (UniqueName: \"kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017367 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017388 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf\") pod \"6c225d27-e34e-407c-a339-cf6016a13118\" (UID: \"6c225d27-e34e-407c-a339-cf6016a13118\") " Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.017714 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.018094 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.022548 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg" (OuterVolumeSpecName: "kube-api-access-dxjxg") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "kube-api-access-dxjxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.041514 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.041532 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.042278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts" (OuterVolumeSpecName: "scripts") pod "6c225d27-e34e-407c-a339-cf6016a13118" (UID: "6c225d27-e34e-407c-a339-cf6016a13118"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118898 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118923 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118932 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c225d27-e34e-407c-a339-cf6016a13118-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118943 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxjxg\" (UniqueName: \"kubernetes.io/projected/6c225d27-e34e-407c-a339-cf6016a13118-kube-api-access-dxjxg\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118953 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6c225d27-e34e-407c-a339-cf6016a13118-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.118960 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6c225d27-e34e-407c-a339-cf6016a13118-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.543488 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4d37b9601fda15af9b9c617af1c6178e98278dec4d0aa2eb58a594e083d0fc" Mar 09 09:56:38 crc kubenswrapper[4971]: I0309 09:56:38.543577 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bvbq6" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.080799 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm"] Mar 09 09:56:39 crc kubenswrapper[4971]: E0309 09:56:39.081216 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c225d27-e34e-407c-a339-cf6016a13118" containerName="swift-ring-rebalance" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.081234 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c225d27-e34e-407c-a339-cf6016a13118" containerName="swift-ring-rebalance" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.081517 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c225d27-e34e-407c-a339-cf6016a13118" containerName="swift-ring-rebalance" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.082254 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.084822 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.091932 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.093561 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm"] Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.134715 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.134766 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.134786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vfz\" (UniqueName: \"kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.135050 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.135104 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.135225 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.161668 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c225d27-e34e-407c-a339-cf6016a13118" path="/var/lib/kubelet/pods/6c225d27-e34e-407c-a339-cf6016a13118/volumes" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236189 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236265 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236298 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vfz\" (UniqueName: \"kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236407 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236433 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236463 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.236994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.237566 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.237852 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.241934 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.243188 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.267442 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vfz\" (UniqueName: 
\"kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz\") pod \"swift-ring-rebalance-debug-lfqvm\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.444467 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:39 crc kubenswrapper[4971]: I0309 09:56:39.864646 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm"] Mar 09 09:56:40 crc kubenswrapper[4971]: I0309 09:56:40.570011 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" event={"ID":"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5","Type":"ContainerStarted","Data":"9f835c1a1165b5020d440f6b1c6a007983ccf2a12f490c385ffe2b9d17c8a14e"} Mar 09 09:56:40 crc kubenswrapper[4971]: I0309 09:56:40.570517 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" event={"ID":"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5","Type":"ContainerStarted","Data":"de3e45eb725aa6525c53a8e0d4f9c84383682fbf67fbd9cd7b39d72241bfe9b9"} Mar 09 09:56:41 crc kubenswrapper[4971]: I0309 09:56:41.581384 4971 generic.go:334] "Generic (PLEG): container finished" podID="e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" containerID="9f835c1a1165b5020d440f6b1c6a007983ccf2a12f490c385ffe2b9d17c8a14e" exitCode=0 Mar 09 09:56:41 crc kubenswrapper[4971]: I0309 09:56:41.581461 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" event={"ID":"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5","Type":"ContainerDied","Data":"9f835c1a1165b5020d440f6b1c6a007983ccf2a12f490c385ffe2b9d17c8a14e"} Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.910526 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.941075 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm"] Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.947575 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm"] Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995593 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995705 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995734 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995757 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.995826 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7vfz\" (UniqueName: \"kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz\") pod \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\" (UID: \"e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5\") " Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.996346 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:42 crc kubenswrapper[4971]: I0309 09:56:42.996572 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.000651 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz" (OuterVolumeSpecName: "kube-api-access-t7vfz") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "kube-api-access-t7vfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.016901 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts" (OuterVolumeSpecName: "scripts") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.017902 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.021222 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" (UID: "e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097369 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097404 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097413 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097421 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097429 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.097437 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7vfz\" (UniqueName: \"kubernetes.io/projected/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5-kube-api-access-t7vfz\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.162453 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" path="/var/lib/kubelet/pods/e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5/volumes" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.606004 4971 scope.go:117] "RemoveContainer" 
containerID="9f835c1a1165b5020d440f6b1c6a007983ccf2a12f490c385ffe2b9d17c8a14e" Mar 09 09:56:43 crc kubenswrapper[4971]: I0309 09:56:43.606045 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lfqvm" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.087285 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt"] Mar 09 09:56:44 crc kubenswrapper[4971]: E0309 09:56:44.088341 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" containerName="swift-ring-rebalance" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.088401 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" containerName="swift-ring-rebalance" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.088786 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cbbe1e-20e0-4ab8-a63e-f119e588fbb5" containerName="swift-ring-rebalance" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.089783 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.091766 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.092881 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.100834 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt"] Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.212534 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.212852 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.212966 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.213081 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.213283 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.213338 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgb6\" (UniqueName: \"kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.314385 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.314476 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc 
kubenswrapper[4971]: I0309 09:56:44.314514 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.314546 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.314605 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.314634 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgb6\" (UniqueName: \"kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.315226 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc 
kubenswrapper[4971]: I0309 09:56:44.315331 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.315680 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.322326 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.323951 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.333918 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgb6\" (UniqueName: \"kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6\") pod \"swift-ring-rebalance-debug-mh6bt\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: 
I0309 09:56:44.424970 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:44 crc kubenswrapper[4971]: I0309 09:56:44.906560 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt"] Mar 09 09:56:44 crc kubenswrapper[4971]: W0309 09:56:44.917513 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode75c25ee_9046_485f_8353_db0ce7ec9b9b.slice/crio-098b945ca14570eddbb55647fefe3d512365dfd48f5be90006a106446bfa6391 WatchSource:0}: Error finding container 098b945ca14570eddbb55647fefe3d512365dfd48f5be90006a106446bfa6391: Status 404 returned error can't find the container with id 098b945ca14570eddbb55647fefe3d512365dfd48f5be90006a106446bfa6391 Mar 09 09:56:45 crc kubenswrapper[4971]: I0309 09:56:45.628374 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" event={"ID":"e75c25ee-9046-485f-8353-db0ce7ec9b9b","Type":"ContainerStarted","Data":"bf4e0b4cd830d25934f87c427bbbad31e1102c0c0ea5bc2a89a31559114f2caf"} Mar 09 09:56:45 crc kubenswrapper[4971]: I0309 09:56:45.628701 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" event={"ID":"e75c25ee-9046-485f-8353-db0ce7ec9b9b","Type":"ContainerStarted","Data":"098b945ca14570eddbb55647fefe3d512365dfd48f5be90006a106446bfa6391"} Mar 09 09:56:45 crc kubenswrapper[4971]: I0309 09:56:45.648581 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" podStartSLOduration=1.648562417 podStartE2EDuration="1.648562417s" podCreationTimestamp="2026-03-09 09:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:45.645558633 +0000 
UTC m=+2209.205486473" watchObservedRunningTime="2026-03-09 09:56:45.648562417 +0000 UTC m=+2209.208490227" Mar 09 09:56:46 crc kubenswrapper[4971]: I0309 09:56:46.651721 4971 generic.go:334] "Generic (PLEG): container finished" podID="e75c25ee-9046-485f-8353-db0ce7ec9b9b" containerID="bf4e0b4cd830d25934f87c427bbbad31e1102c0c0ea5bc2a89a31559114f2caf" exitCode=0 Mar 09 09:56:46 crc kubenswrapper[4971]: I0309 09:56:46.651786 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" event={"ID":"e75c25ee-9046-485f-8353-db0ce7ec9b9b","Type":"ContainerDied","Data":"bf4e0b4cd830d25934f87c427bbbad31e1102c0c0ea5bc2a89a31559114f2caf"} Mar 09 09:56:47 crc kubenswrapper[4971]: I0309 09:56:47.975286 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.006910 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt"] Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.014368 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt"] Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.079728 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.079863 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 
09:56:48.079948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080090 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080191 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgb6\" (UniqueName: \"kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080253 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf\") pod \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\" (UID: \"e75c25ee-9046-485f-8353-db0ce7ec9b9b\") " Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080660 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.080879 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.093591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6" (OuterVolumeSpecName: "kube-api-access-xcgb6") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). InnerVolumeSpecName "kube-api-access-xcgb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.107774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts" (OuterVolumeSpecName: "scripts") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.112129 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.112514 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e75c25ee-9046-485f-8353-db0ce7ec9b9b" (UID: "e75c25ee-9046-485f-8353-db0ce7ec9b9b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.182377 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.182407 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e75c25ee-9046-485f-8353-db0ce7ec9b9b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.182416 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e75c25ee-9046-485f-8353-db0ce7ec9b9b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.182461 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e75c25ee-9046-485f-8353-db0ce7ec9b9b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.182473 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgb6\" (UniqueName: \"kubernetes.io/projected/e75c25ee-9046-485f-8353-db0ce7ec9b9b-kube-api-access-xcgb6\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.674759 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="098b945ca14570eddbb55647fefe3d512365dfd48f5be90006a106446bfa6391" Mar 09 09:56:48 crc kubenswrapper[4971]: I0309 09:56:48.675109 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mh6bt" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.170823 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75c25ee-9046-485f-8353-db0ce7ec9b9b" path="/var/lib/kubelet/pods/e75c25ee-9046-485f-8353-db0ce7ec9b9b/volumes" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.171282 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn"] Mar 09 09:56:49 crc kubenswrapper[4971]: E0309 09:56:49.171543 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75c25ee-9046-485f-8353-db0ce7ec9b9b" containerName="swift-ring-rebalance" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.171553 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75c25ee-9046-485f-8353-db0ce7ec9b9b" containerName="swift-ring-rebalance" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.171703 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75c25ee-9046-485f-8353-db0ce7ec9b9b" containerName="swift-ring-rebalance" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.172130 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn"] Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.172202 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.175149 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.175721 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303402 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6f4p\" (UniqueName: \"kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303470 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts\") pod 
\"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303547 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.303607 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404713 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404796 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404827 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6f4p\" (UniqueName: 
\"kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404881 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404911 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.404932 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.405981 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.406112 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.406117 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.409387 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.420278 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.422244 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6f4p\" (UniqueName: \"kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p\") pod \"swift-ring-rebalance-debug-wwdrn\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.489772 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:49 crc kubenswrapper[4971]: I0309 09:56:49.906008 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn"] Mar 09 09:56:50 crc kubenswrapper[4971]: I0309 09:56:50.700685 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" event={"ID":"b50632a3-cc70-4f7a-b015-e26085d986c8","Type":"ContainerStarted","Data":"da8b1d2e49398d1582a5fdce9551bb44f5bd8e0e36d0db20e03fa298549bbeee"} Mar 09 09:56:50 crc kubenswrapper[4971]: I0309 09:56:50.701194 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" event={"ID":"b50632a3-cc70-4f7a-b015-e26085d986c8","Type":"ContainerStarted","Data":"4f37342e391f8d8e68122f3a952a20462a9145575bdcd89b79edb371fd78d1c9"} Mar 09 09:56:50 crc kubenswrapper[4971]: I0309 09:56:50.733691 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" podStartSLOduration=1.733660838 podStartE2EDuration="1.733660838s" podCreationTimestamp="2026-03-09 09:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:50.721568047 +0000 UTC m=+2214.281495927" watchObservedRunningTime="2026-03-09 09:56:50.733660838 +0000 UTC m=+2214.293588658" Mar 09 09:56:51 crc kubenswrapper[4971]: I0309 09:56:51.710190 4971 generic.go:334] "Generic (PLEG): container finished" podID="b50632a3-cc70-4f7a-b015-e26085d986c8" containerID="da8b1d2e49398d1582a5fdce9551bb44f5bd8e0e36d0db20e03fa298549bbeee" exitCode=0 Mar 09 09:56:51 crc kubenswrapper[4971]: I0309 09:56:51.710270 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" 
event={"ID":"b50632a3-cc70-4f7a-b015-e26085d986c8","Type":"ContainerDied","Data":"da8b1d2e49398d1582a5fdce9551bb44f5bd8e0e36d0db20e03fa298549bbeee"} Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.005473 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.034504 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn"] Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.044429 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn"] Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.166729 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6f4p\" (UniqueName: \"kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.167124 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.167158 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.167178 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.167201 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.167225 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices\") pod \"b50632a3-cc70-4f7a-b015-e26085d986c8\" (UID: \"b50632a3-cc70-4f7a-b015-e26085d986c8\") " Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.168244 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.169253 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.172090 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p" (OuterVolumeSpecName: "kube-api-access-j6f4p") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "kube-api-access-j6f4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.191752 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts" (OuterVolumeSpecName: "scripts") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.192622 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.193416 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b50632a3-cc70-4f7a-b015-e26085d986c8" (UID: "b50632a3-cc70-4f7a-b015-e26085d986c8"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268876 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268914 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b50632a3-cc70-4f7a-b015-e26085d986c8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268923 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268934 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b50632a3-cc70-4f7a-b015-e26085d986c8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268942 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b50632a3-cc70-4f7a-b015-e26085d986c8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.268951 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6f4p\" (UniqueName: \"kubernetes.io/projected/b50632a3-cc70-4f7a-b015-e26085d986c8-kube-api-access-j6f4p\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.730259 4971 scope.go:117] "RemoveContainer" containerID="da8b1d2e49398d1582a5fdce9551bb44f5bd8e0e36d0db20e03fa298549bbeee" Mar 09 09:56:53 crc kubenswrapper[4971]: I0309 09:56:53.730435 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wwdrn" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.219801 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fssjx"] Mar 09 09:56:54 crc kubenswrapper[4971]: E0309 09:56:54.220154 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50632a3-cc70-4f7a-b015-e26085d986c8" containerName="swift-ring-rebalance" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.220170 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50632a3-cc70-4f7a-b015-e26085d986c8" containerName="swift-ring-rebalance" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.220390 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50632a3-cc70-4f7a-b015-e26085d986c8" containerName="swift-ring-rebalance" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.221010 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.223009 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.226379 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.238452 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fssjx"] Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.384134 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.384194 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.384219 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkpbw\" (UniqueName: \"kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.385037 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.385110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.385176 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.486857 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.486938 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkpbw\" (UniqueName: \"kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487047 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487108 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487174 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487330 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487524 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.487966 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.488285 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.490784 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.490813 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.503918 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkpbw\" (UniqueName: \"kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw\") pod \"swift-ring-rebalance-debug-fssjx\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:54 crc kubenswrapper[4971]: I0309 09:56:54.539599 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:55 crc kubenswrapper[4971]: I0309 09:56:55.015623 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fssjx"] Mar 09 09:56:55 crc kubenswrapper[4971]: I0309 09:56:55.168140 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50632a3-cc70-4f7a-b015-e26085d986c8" path="/var/lib/kubelet/pods/b50632a3-cc70-4f7a-b015-e26085d986c8/volumes" Mar 09 09:56:55 crc kubenswrapper[4971]: I0309 09:56:55.769963 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" event={"ID":"a1597159-1fa5-4364-a8e1-67a1a2b14715","Type":"ContainerStarted","Data":"76e0391295db63046de77962ba199197b6ccf7fb83aafa13c1ded24ebb4178d3"} Mar 09 09:56:55 crc kubenswrapper[4971]: I0309 09:56:55.770439 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" event={"ID":"a1597159-1fa5-4364-a8e1-67a1a2b14715","Type":"ContainerStarted","Data":"e04e851f1014772b7c4dee9889aa741c2af76180fd38803e6d4c60223205d32b"} Mar 09 09:56:55 crc kubenswrapper[4971]: I0309 09:56:55.796242 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" podStartSLOduration=1.796220935 podStartE2EDuration="1.796220935s" podCreationTimestamp="2026-03-09 09:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:56:55.792540811 +0000 UTC m=+2219.352468621" watchObservedRunningTime="2026-03-09 09:56:55.796220935 +0000 UTC m=+2219.356148745" Mar 09 09:56:56 crc kubenswrapper[4971]: I0309 09:56:56.781856 4971 generic.go:334] "Generic (PLEG): container finished" podID="a1597159-1fa5-4364-a8e1-67a1a2b14715" containerID="76e0391295db63046de77962ba199197b6ccf7fb83aafa13c1ded24ebb4178d3" exitCode=0 
Mar 09 09:56:56 crc kubenswrapper[4971]: I0309 09:56:56.781909 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" event={"ID":"a1597159-1fa5-4364-a8e1-67a1a2b14715","Type":"ContainerDied","Data":"76e0391295db63046de77962ba199197b6ccf7fb83aafa13c1ded24ebb4178d3"} Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.083331 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.121978 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fssjx"] Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.129228 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fssjx"] Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243172 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243250 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243307 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkpbw\" (UniqueName: \"kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243401 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243425 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.243446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices\") pod \"a1597159-1fa5-4364-a8e1-67a1a2b14715\" (UID: \"a1597159-1fa5-4364-a8e1-67a1a2b14715\") " Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.244461 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.244774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.248782 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw" (OuterVolumeSpecName: "kube-api-access-jkpbw") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "kube-api-access-jkpbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.263545 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts" (OuterVolumeSpecName: "scripts") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.265736 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.272716 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a1597159-1fa5-4364-a8e1-67a1a2b14715" (UID: "a1597159-1fa5-4364-a8e1-67a1a2b14715"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344877 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkpbw\" (UniqueName: \"kubernetes.io/projected/a1597159-1fa5-4364-a8e1-67a1a2b14715-kube-api-access-jkpbw\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344911 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344923 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1597159-1fa5-4364-a8e1-67a1a2b14715-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344933 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344945 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1597159-1fa5-4364-a8e1-67a1a2b14715-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.344954 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1597159-1fa5-4364-a8e1-67a1a2b14715-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.804764 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e04e851f1014772b7c4dee9889aa741c2af76180fd38803e6d4c60223205d32b" Mar 09 09:56:58 crc kubenswrapper[4971]: I0309 09:56:58.804820 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fssjx" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.164231 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1597159-1fa5-4364-a8e1-67a1a2b14715" path="/var/lib/kubelet/pods/a1597159-1fa5-4364-a8e1-67a1a2b14715/volumes" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.283087 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh"] Mar 09 09:56:59 crc kubenswrapper[4971]: E0309 09:56:59.283446 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1597159-1fa5-4364-a8e1-67a1a2b14715" containerName="swift-ring-rebalance" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.283467 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1597159-1fa5-4364-a8e1-67a1a2b14715" containerName="swift-ring-rebalance" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.283645 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1597159-1fa5-4364-a8e1-67a1a2b14715" containerName="swift-ring-rebalance" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.284131 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.286636 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.288509 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.293588 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh"] Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359650 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359716 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rlvv\" (UniqueName: \"kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359786 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359848 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359969 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.359997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.461213 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.461264 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rlvv\" (UniqueName: \"kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 
crc kubenswrapper[4971]: I0309 09:56:59.461298 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.461334 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.461419 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.461440 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.462104 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 
09:56:59.462750 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.463109 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.466559 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.466904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.479284 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rlvv\" (UniqueName: \"kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv\") pod \"swift-ring-rebalance-debug-nfbbh\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.649834 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:56:59 crc kubenswrapper[4971]: I0309 09:56:59.863185 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh"] Mar 09 09:56:59 crc kubenswrapper[4971]: W0309 09:56:59.868272 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd13f70cc_1438_4be9_9144_dbece4129ef8.slice/crio-5d8d72c58379b1377d85c44e6f8c55b67970902a30c4345f17105c00df15475c WatchSource:0}: Error finding container 5d8d72c58379b1377d85c44e6f8c55b67970902a30c4345f17105c00df15475c: Status 404 returned error can't find the container with id 5d8d72c58379b1377d85c44e6f8c55b67970902a30c4345f17105c00df15475c Mar 09 09:57:00 crc kubenswrapper[4971]: I0309 09:57:00.825493 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" event={"ID":"d13f70cc-1438-4be9-9144-dbece4129ef8","Type":"ContainerStarted","Data":"475793facec3a2f431770ecc9f9694cfcb75932344012d26b4b3ba8f7b779aff"} Mar 09 09:57:00 crc kubenswrapper[4971]: I0309 09:57:00.825902 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" event={"ID":"d13f70cc-1438-4be9-9144-dbece4129ef8","Type":"ContainerStarted","Data":"5d8d72c58379b1377d85c44e6f8c55b67970902a30c4345f17105c00df15475c"} Mar 09 09:57:00 crc kubenswrapper[4971]: I0309 09:57:00.840975 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" podStartSLOduration=1.840954797 podStartE2EDuration="1.840954797s" podCreationTimestamp="2026-03-09 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:00.838671573 +0000 UTC m=+2224.398599373" 
watchObservedRunningTime="2026-03-09 09:57:00.840954797 +0000 UTC m=+2224.400882617" Mar 09 09:57:01 crc kubenswrapper[4971]: I0309 09:57:01.840527 4971 generic.go:334] "Generic (PLEG): container finished" podID="d13f70cc-1438-4be9-9144-dbece4129ef8" containerID="475793facec3a2f431770ecc9f9694cfcb75932344012d26b4b3ba8f7b779aff" exitCode=0 Mar 09 09:57:01 crc kubenswrapper[4971]: I0309 09:57:01.840607 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" event={"ID":"d13f70cc-1438-4be9-9144-dbece4129ef8","Type":"ContainerDied","Data":"475793facec3a2f431770ecc9f9694cfcb75932344012d26b4b3ba8f7b779aff"} Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.171721 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.208275 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh"] Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.216833 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.216885 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.216948 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rlvv\" (UniqueName: 
\"kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.216968 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.216989 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.217017 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift\") pod \"d13f70cc-1438-4be9-9144-dbece4129ef8\" (UID: \"d13f70cc-1438-4be9-9144-dbece4129ef8\") " Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.219270 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.219389 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.224257 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv" (OuterVolumeSpecName: "kube-api-access-6rlvv") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "kube-api-access-6rlvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.230998 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh"] Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.242514 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts" (OuterVolumeSpecName: "scripts") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.266555 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.269476 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d13f70cc-1438-4be9-9144-dbece4129ef8" (UID: "d13f70cc-1438-4be9-9144-dbece4129ef8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320294 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320363 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d13f70cc-1438-4be9-9144-dbece4129ef8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320376 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320386 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d13f70cc-1438-4be9-9144-dbece4129ef8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320398 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rlvv\" (UniqueName: \"kubernetes.io/projected/d13f70cc-1438-4be9-9144-dbece4129ef8-kube-api-access-6rlvv\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.320412 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d13f70cc-1438-4be9-9144-dbece4129ef8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.860323 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8d72c58379b1377d85c44e6f8c55b67970902a30c4345f17105c00df15475c" Mar 09 09:57:03 crc kubenswrapper[4971]: I0309 09:57:03.860409 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nfbbh" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.332422 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw"] Mar 09 09:57:04 crc kubenswrapper[4971]: E0309 09:57:04.333637 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13f70cc-1438-4be9-9144-dbece4129ef8" containerName="swift-ring-rebalance" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.333830 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13f70cc-1438-4be9-9144-dbece4129ef8" containerName="swift-ring-rebalance" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.335688 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13f70cc-1438-4be9-9144-dbece4129ef8" containerName="swift-ring-rebalance" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.336622 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.338825 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.341223 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.343721 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw"] Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537056 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537110 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537147 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537171 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537309 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frpr9\" (UniqueName: \"kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.537510 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640049 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 
09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640212 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640245 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640276 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640313 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frpr9\" (UniqueName: \"kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.640946 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc 
kubenswrapper[4971]: I0309 09:57:04.641776 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.641849 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.645737 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.647584 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.663165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frpr9\" (UniqueName: \"kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9\") pod \"swift-ring-rebalance-debug-9fdjw\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: 
I0309 09:57:04.666285 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:04 crc kubenswrapper[4971]: W0309 09:57:04.910833 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ecf9bb_72ba_4e80_88dc_04e27bcbc641.slice/crio-e44b84be2ada174208784106e7482a21aac3e5a15732fdcb26a51c19ccf8180d WatchSource:0}: Error finding container e44b84be2ada174208784106e7482a21aac3e5a15732fdcb26a51c19ccf8180d: Status 404 returned error can't find the container with id e44b84be2ada174208784106e7482a21aac3e5a15732fdcb26a51c19ccf8180d Mar 09 09:57:04 crc kubenswrapper[4971]: I0309 09:57:04.913287 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw"] Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.160867 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13f70cc-1438-4be9-9144-dbece4129ef8" path="/var/lib/kubelet/pods/d13f70cc-1438-4be9-9144-dbece4129ef8/volumes" Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.893748 4971 scope.go:117] "RemoveContainer" containerID="b94c023fd09c46b872894ea4b638d1206bd64cfd0dd05ff4d670a7f12624ed17" Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.894316 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" event={"ID":"07ecf9bb-72ba-4e80-88dc-04e27bcbc641","Type":"ContainerStarted","Data":"664d86477fa926710de60a78aee673ad9c9795e7460c47390a6e656aea7cf8e2"} Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.894373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" event={"ID":"07ecf9bb-72ba-4e80-88dc-04e27bcbc641","Type":"ContainerStarted","Data":"e44b84be2ada174208784106e7482a21aac3e5a15732fdcb26a51c19ccf8180d"} Mar 09 09:57:05 crc kubenswrapper[4971]: 
I0309 09:57:05.927311 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" podStartSLOduration=1.9272892929999998 podStartE2EDuration="1.927289293s" podCreationTimestamp="2026-03-09 09:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:05.91439445 +0000 UTC m=+2229.474322270" watchObservedRunningTime="2026-03-09 09:57:05.927289293 +0000 UTC m=+2229.487217103" Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.952841 4971 scope.go:117] "RemoveContainer" containerID="2d9637d355dbf9e3eb1b55f43bcdece9da435ade7e950c9dcf4dfe0c05c04d65" Mar 09 09:57:05 crc kubenswrapper[4971]: I0309 09:57:05.977481 4971 scope.go:117] "RemoveContainer" containerID="cc6b8c6d16fdcd26b230328427571d004b086a3b69f56cebfabf68371b62b69c" Mar 09 09:57:06 crc kubenswrapper[4971]: I0309 09:57:06.912929 4971 generic.go:334] "Generic (PLEG): container finished" podID="07ecf9bb-72ba-4e80-88dc-04e27bcbc641" containerID="664d86477fa926710de60a78aee673ad9c9795e7460c47390a6e656aea7cf8e2" exitCode=0 Mar 09 09:57:06 crc kubenswrapper[4971]: I0309 09:57:06.912972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" event={"ID":"07ecf9bb-72ba-4e80-88dc-04e27bcbc641","Type":"ContainerDied","Data":"664d86477fa926710de60a78aee673ad9c9795e7460c47390a6e656aea7cf8e2"} Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.255418 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.285135 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw"] Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.291292 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw"] Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.406890 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.406951 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.407022 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.407048 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frpr9\" (UniqueName: \"kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.407094 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.407133 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift\") pod \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\" (UID: \"07ecf9bb-72ba-4e80-88dc-04e27bcbc641\") " Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.408421 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.408847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.413561 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9" (OuterVolumeSpecName: "kube-api-access-frpr9") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "kube-api-access-frpr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.432328 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts" (OuterVolumeSpecName: "scripts") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.433291 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.439569 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "07ecf9bb-72ba-4e80-88dc-04e27bcbc641" (UID: "07ecf9bb-72ba-4e80-88dc-04e27bcbc641"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508484 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508525 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frpr9\" (UniqueName: \"kubernetes.io/projected/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-kube-api-access-frpr9\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508543 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508555 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508566 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.508576 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ecf9bb-72ba-4e80-88dc-04e27bcbc641-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.931070 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44b84be2ada174208784106e7482a21aac3e5a15732fdcb26a51c19ccf8180d" Mar 09 09:57:08 crc kubenswrapper[4971]: I0309 09:57:08.931113 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9fdjw" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.161947 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ecf9bb-72ba-4e80-88dc-04e27bcbc641" path="/var/lib/kubelet/pods/07ecf9bb-72ba-4e80-88dc-04e27bcbc641/volumes" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.460391 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77qtd"] Mar 09 09:57:09 crc kubenswrapper[4971]: E0309 09:57:09.460787 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ecf9bb-72ba-4e80-88dc-04e27bcbc641" containerName="swift-ring-rebalance" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.460804 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ecf9bb-72ba-4e80-88dc-04e27bcbc641" containerName="swift-ring-rebalance" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.460985 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ecf9bb-72ba-4e80-88dc-04e27bcbc641" containerName="swift-ring-rebalance" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.461656 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.465042 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.465185 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.479339 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77qtd"] Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.626075 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.626523 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.626754 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.626972 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764rt\" (UniqueName: \"kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.627161 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.627420 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.728372 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.728986 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-764rt\" (UniqueName: \"kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc 
kubenswrapper[4971]: I0309 09:57:09.729086 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.729209 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.728846 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.729321 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.729423 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc 
kubenswrapper[4971]: I0309 09:57:09.729990 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.730025 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.737159 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.737195 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: I0309 09:57:09.745266 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764rt\" (UniqueName: \"kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt\") pod \"swift-ring-rebalance-debug-77qtd\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:09 crc kubenswrapper[4971]: 
I0309 09:57:09.779559 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:10 crc kubenswrapper[4971]: I0309 09:57:10.197904 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77qtd"] Mar 09 09:57:10 crc kubenswrapper[4971]: W0309 09:57:10.207580 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb016a3_a09c_43d2_85e5_0b549789f92e.slice/crio-e510ed1864744bd934e00c3e9a518d5065872714f9c466368c79dc5a796ddfa2 WatchSource:0}: Error finding container e510ed1864744bd934e00c3e9a518d5065872714f9c466368c79dc5a796ddfa2: Status 404 returned error can't find the container with id e510ed1864744bd934e00c3e9a518d5065872714f9c466368c79dc5a796ddfa2 Mar 09 09:57:10 crc kubenswrapper[4971]: I0309 09:57:10.953121 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" event={"ID":"cfb016a3-a09c-43d2-85e5-0b549789f92e","Type":"ContainerStarted","Data":"919e6e55c80df5f96f166baccbc961dfd066fb667b6821478672eb0a661336ac"} Mar 09 09:57:10 crc kubenswrapper[4971]: I0309 09:57:10.954606 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" event={"ID":"cfb016a3-a09c-43d2-85e5-0b549789f92e","Type":"ContainerStarted","Data":"e510ed1864744bd934e00c3e9a518d5065872714f9c466368c79dc5a796ddfa2"} Mar 09 09:57:10 crc kubenswrapper[4971]: I0309 09:57:10.970075 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" podStartSLOduration=1.970056112 podStartE2EDuration="1.970056112s" podCreationTimestamp="2026-03-09 09:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:10.969528917 +0000 
UTC m=+2234.529456747" watchObservedRunningTime="2026-03-09 09:57:10.970056112 +0000 UTC m=+2234.529983922" Mar 09 09:57:11 crc kubenswrapper[4971]: I0309 09:57:11.965048 4971 generic.go:334] "Generic (PLEG): container finished" podID="cfb016a3-a09c-43d2-85e5-0b549789f92e" containerID="919e6e55c80df5f96f166baccbc961dfd066fb667b6821478672eb0a661336ac" exitCode=0 Mar 09 09:57:11 crc kubenswrapper[4971]: I0309 09:57:11.965094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" event={"ID":"cfb016a3-a09c-43d2-85e5-0b549789f92e","Type":"ContainerDied","Data":"919e6e55c80df5f96f166baccbc961dfd066fb667b6821478672eb0a661336ac"} Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.269273 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.307645 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77qtd"] Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.316236 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77qtd"] Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.384807 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.384942 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 
09:57:13.384986 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.385004 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-764rt\" (UniqueName: \"kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.385036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.385074 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts\") pod \"cfb016a3-a09c-43d2-85e5-0b549789f92e\" (UID: \"cfb016a3-a09c-43d2-85e5-0b549789f92e\") " Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.386034 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.386220 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.397503 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt" (OuterVolumeSpecName: "kube-api-access-764rt") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "kube-api-access-764rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.405208 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts" (OuterVolumeSpecName: "scripts") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.411026 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.413056 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cfb016a3-a09c-43d2-85e5-0b549789f92e" (UID: "cfb016a3-a09c-43d2-85e5-0b549789f92e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.486911 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.486946 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.486955 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cfb016a3-a09c-43d2-85e5-0b549789f92e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.486969 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cfb016a3-a09c-43d2-85e5-0b549789f92e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.486983 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cfb016a3-a09c-43d2-85e5-0b549789f92e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.487035 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-764rt\" (UniqueName: 
\"kubernetes.io/projected/cfb016a3-a09c-43d2-85e5-0b549789f92e-kube-api-access-764rt\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.992284 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e510ed1864744bd934e00c3e9a518d5065872714f9c466368c79dc5a796ddfa2" Mar 09 09:57:13 crc kubenswrapper[4971]: I0309 09:57:13.992407 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77qtd" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.438895 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5"] Mar 09 09:57:14 crc kubenswrapper[4971]: E0309 09:57:14.439272 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb016a3-a09c-43d2-85e5-0b549789f92e" containerName="swift-ring-rebalance" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.439291 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb016a3-a09c-43d2-85e5-0b549789f92e" containerName="swift-ring-rebalance" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.439517 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb016a3-a09c-43d2-85e5-0b549789f92e" containerName="swift-ring-rebalance" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.440113 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.445727 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.446069 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.450480 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5"] Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598655 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598699 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598733 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598819 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598843 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k6kc\" (UniqueName: \"kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.598894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700299 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700382 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k6kc\" (UniqueName: \"kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 
09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700441 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700511 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.700566 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.701183 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc 
kubenswrapper[4971]: I0309 09:57:14.701527 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.701842 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.705199 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.706678 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.729977 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k6kc\" (UniqueName: \"kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc\") pod \"swift-ring-rebalance-debug-hs6z5\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: 
I0309 09:57:14.776920 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.795286 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:57:14 crc kubenswrapper[4971]: I0309 09:57:14.795344 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:57:15 crc kubenswrapper[4971]: I0309 09:57:15.160948 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb016a3-a09c-43d2-85e5-0b549789f92e" path="/var/lib/kubelet/pods/cfb016a3-a09c-43d2-85e5-0b549789f92e/volumes" Mar 09 09:57:15 crc kubenswrapper[4971]: I0309 09:57:15.212170 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5"] Mar 09 09:57:15 crc kubenswrapper[4971]: W0309 09:57:15.224769 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50239ed2_0dab_4c67_baba_f508b90fd33e.slice/crio-a87f745783bb8b545614559023cd96cc374786b6eb27f7ac2a862bc51c9e9c62 WatchSource:0}: Error finding container a87f745783bb8b545614559023cd96cc374786b6eb27f7ac2a862bc51c9e9c62: Status 404 returned error can't find the container with id a87f745783bb8b545614559023cd96cc374786b6eb27f7ac2a862bc51c9e9c62 Mar 09 09:57:16 crc kubenswrapper[4971]: I0309 09:57:16.007733 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" event={"ID":"50239ed2-0dab-4c67-baba-f508b90fd33e","Type":"ContainerStarted","Data":"25f4f5ce8ee26693d1bdd6e6c575a9ade25a381aed2bebdb78375e00d502030e"} Mar 09 09:57:16 crc kubenswrapper[4971]: I0309 09:57:16.008988 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" event={"ID":"50239ed2-0dab-4c67-baba-f508b90fd33e","Type":"ContainerStarted","Data":"a87f745783bb8b545614559023cd96cc374786b6eb27f7ac2a862bc51c9e9c62"} Mar 09 09:57:16 crc kubenswrapper[4971]: I0309 09:57:16.031466 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" podStartSLOduration=2.031445355 podStartE2EDuration="2.031445355s" podCreationTimestamp="2026-03-09 09:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:16.024992733 +0000 UTC m=+2239.584920543" watchObservedRunningTime="2026-03-09 09:57:16.031445355 +0000 UTC m=+2239.591373165" Mar 09 09:57:17 crc kubenswrapper[4971]: I0309 09:57:17.017442 4971 generic.go:334] "Generic (PLEG): container finished" podID="50239ed2-0dab-4c67-baba-f508b90fd33e" containerID="25f4f5ce8ee26693d1bdd6e6c575a9ade25a381aed2bebdb78375e00d502030e" exitCode=0 Mar 09 09:57:17 crc kubenswrapper[4971]: I0309 09:57:17.017649 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" event={"ID":"50239ed2-0dab-4c67-baba-f508b90fd33e","Type":"ContainerDied","Data":"25f4f5ce8ee26693d1bdd6e6c575a9ade25a381aed2bebdb78375e00d502030e"} Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.301701 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.339604 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5"] Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357294 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357424 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k6kc\" (UniqueName: \"kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357539 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357650 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") 
" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.357723 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices\") pod \"50239ed2-0dab-4c67-baba-f508b90fd33e\" (UID: \"50239ed2-0dab-4c67-baba-f508b90fd33e\") " Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.358780 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.358882 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.359594 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5"] Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.363121 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc" (OuterVolumeSpecName: "kube-api-access-4k6kc") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "kube-api-access-4k6kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.379065 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts" (OuterVolumeSpecName: "scripts") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.380101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.387833 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "50239ed2-0dab-4c67-baba-f508b90fd33e" (UID: "50239ed2-0dab-4c67-baba-f508b90fd33e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458765 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458805 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458814 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50239ed2-0dab-4c67-baba-f508b90fd33e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458824 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k6kc\" (UniqueName: \"kubernetes.io/projected/50239ed2-0dab-4c67-baba-f508b90fd33e-kube-api-access-4k6kc\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458834 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/50239ed2-0dab-4c67-baba-f508b90fd33e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:18 crc kubenswrapper[4971]: I0309 09:57:18.458842 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/50239ed2-0dab-4c67-baba-f508b90fd33e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.044102 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87f745783bb8b545614559023cd96cc374786b6eb27f7ac2a862bc51c9e9c62" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.044156 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hs6z5" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.162918 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50239ed2-0dab-4c67-baba-f508b90fd33e" path="/var/lib/kubelet/pods/50239ed2-0dab-4c67-baba-f508b90fd33e/volumes" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.474136 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-htngf"] Mar 09 09:57:19 crc kubenswrapper[4971]: E0309 09:57:19.474431 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50239ed2-0dab-4c67-baba-f508b90fd33e" containerName="swift-ring-rebalance" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.474442 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="50239ed2-0dab-4c67-baba-f508b90fd33e" containerName="swift-ring-rebalance" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.474593 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="50239ed2-0dab-4c67-baba-f508b90fd33e" containerName="swift-ring-rebalance" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.475112 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.477784 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.480947 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482154 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482222 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482263 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482284 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt8n5\" (UniqueName: \"kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5\") pod 
\"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482373 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.482394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.484169 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-htngf"] Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.583909 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.583959 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 
09:57:19.584094 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.584176 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.584228 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.584249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt8n5\" (UniqueName: \"kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.584527 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: 
I0309 09:57:19.585097 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.585117 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.587925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.588957 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.602825 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt8n5\" (UniqueName: \"kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5\") pod \"swift-ring-rebalance-debug-htngf\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:19 crc kubenswrapper[4971]: I0309 09:57:19.797516 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:20 crc kubenswrapper[4971]: I0309 09:57:20.213706 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-htngf"] Mar 09 09:57:21 crc kubenswrapper[4971]: I0309 09:57:21.064946 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" event={"ID":"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f","Type":"ContainerStarted","Data":"db9706a2cd00b931324fa2f277088837c2872c9b3268294838243eb2e2d7fecc"} Mar 09 09:57:21 crc kubenswrapper[4971]: I0309 09:57:21.065422 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" event={"ID":"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f","Type":"ContainerStarted","Data":"df0182494ac764f2c75c236026ba81c9d8d4383fa51bb3bf47ce7456338c5dd5"} Mar 09 09:57:21 crc kubenswrapper[4971]: I0309 09:57:21.089784 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" podStartSLOduration=2.089769062 podStartE2EDuration="2.089769062s" podCreationTimestamp="2026-03-09 09:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:21.088824765 +0000 UTC m=+2244.648752615" watchObservedRunningTime="2026-03-09 09:57:21.089769062 +0000 UTC m=+2244.649696872" Mar 09 09:57:22 crc kubenswrapper[4971]: I0309 09:57:22.076760 4971 generic.go:334] "Generic (PLEG): container finished" podID="dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" containerID="db9706a2cd00b931324fa2f277088837c2872c9b3268294838243eb2e2d7fecc" exitCode=0 Mar 09 09:57:22 crc kubenswrapper[4971]: I0309 09:57:22.076816 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" 
event={"ID":"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f","Type":"ContainerDied","Data":"db9706a2cd00b931324fa2f277088837c2872c9b3268294838243eb2e2d7fecc"} Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.355095 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.395601 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-htngf"] Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.401511 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-htngf"] Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.439821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.439954 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.439984 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.440026 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.440051 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.440099 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt8n5\" (UniqueName: \"kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5\") pod \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\" (UID: \"dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f\") " Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.440863 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.441204 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.446643 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5" (OuterVolumeSpecName: "kube-api-access-nt8n5") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "kube-api-access-nt8n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.461739 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts" (OuterVolumeSpecName: "scripts") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.462773 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.473619 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" (UID: "dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542551 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt8n5\" (UniqueName: \"kubernetes.io/projected/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-kube-api-access-nt8n5\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542590 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542605 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542617 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542629 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:23 crc kubenswrapper[4971]: I0309 09:57:23.542639 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.098816 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0182494ac764f2c75c236026ba81c9d8d4383fa51bb3bf47ce7456338c5dd5" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.098870 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-htngf" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.529400 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"] Mar 09 09:57:24 crc kubenswrapper[4971]: E0309 09:57:24.529795 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" containerName="swift-ring-rebalance" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.529813 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" containerName="swift-ring-rebalance" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.530016 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" containerName="swift-ring-rebalance" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.530655 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.532642 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.538993 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.541176 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"] Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555089 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkbpv\" (UniqueName: \"kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555160 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555180 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555199 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555220 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.555247 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf\") pod 
\"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656681 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkbpv\" (UniqueName: \"kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656771 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656854 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656885 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.656934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.657776 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.657855 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.658197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.661692 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.661804 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.673218 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkbpv\" (UniqueName: \"kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv\") pod \"swift-ring-rebalance-debug-lgm55\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:24 crc kubenswrapper[4971]: I0309 09:57:24.848142 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" Mar 09 09:57:25 crc kubenswrapper[4971]: I0309 09:57:25.146120 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"] Mar 09 09:57:25 crc kubenswrapper[4971]: I0309 09:57:25.162140 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f" path="/var/lib/kubelet/pods/dab93ee8-27e0-44ca-89b2-59dcf7a2ea1f/volumes" Mar 09 09:57:26 crc kubenswrapper[4971]: I0309 09:57:26.116583 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" event={"ID":"ae3e5225-fb93-48c0-ab21-93c15a605088","Type":"ContainerStarted","Data":"2a6dda8fdf759fe3ad9f95692ab3bfba56c2aaf30fb3b99e79c07b2f71f9b5d0"} Mar 09 09:57:26 crc kubenswrapper[4971]: I0309 09:57:26.116996 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" event={"ID":"ae3e5225-fb93-48c0-ab21-93c15a605088","Type":"ContainerStarted","Data":"01a1a4b3ef0e23e832b533c0817d7dbe0ef1e5705ff81ee6e10ef95a202caeab"} Mar 09 09:57:26 crc kubenswrapper[4971]: I0309 09:57:26.135997 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" podStartSLOduration=2.135978096 podStartE2EDuration="2.135978096s" podCreationTimestamp="2026-03-09 09:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:26.13185772 +0000 UTC m=+2249.691785520" watchObservedRunningTime="2026-03-09 09:57:26.135978096 +0000 UTC m=+2249.695905906" Mar 09 09:57:27 crc kubenswrapper[4971]: I0309 09:57:27.127385 4971 generic.go:334] "Generic (PLEG): container finished" podID="ae3e5225-fb93-48c0-ab21-93c15a605088" containerID="2a6dda8fdf759fe3ad9f95692ab3bfba56c2aaf30fb3b99e79c07b2f71f9b5d0" exitCode=0 
Mar 09 09:57:27 crc kubenswrapper[4971]: I0309 09:57:27.127481 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55" event={"ID":"ae3e5225-fb93-48c0-ab21-93c15a605088","Type":"ContainerDied","Data":"2a6dda8fdf759fe3ad9f95692ab3bfba56c2aaf30fb3b99e79c07b2f71f9b5d0"}
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.407073 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.438952 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"]
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.445820 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"]
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517520 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517613 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkbpv\" (UniqueName: \"kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517702 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517728 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.517868 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf\") pod \"ae3e5225-fb93-48c0-ab21-93c15a605088\" (UID: \"ae3e5225-fb93-48c0-ab21-93c15a605088\") "
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.518898 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.519151 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.527611 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv" (OuterVolumeSpecName: "kube-api-access-hkbpv") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "kube-api-access-hkbpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.541467 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.544050 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts" (OuterVolumeSpecName: "scripts") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.555860 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ae3e5225-fb93-48c0-ab21-93c15a605088" (UID: "ae3e5225-fb93-48c0-ab21-93c15a605088"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620264 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620316 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkbpv\" (UniqueName: \"kubernetes.io/projected/ae3e5225-fb93-48c0-ab21-93c15a605088-kube-api-access-hkbpv\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620339 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae3e5225-fb93-48c0-ab21-93c15a605088-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620392 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620420 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae3e5225-fb93-48c0-ab21-93c15a605088-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:28 crc kubenswrapper[4971]: I0309 09:57:28.620446 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae3e5225-fb93-48c0-ab21-93c15a605088-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.152087 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lgm55"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.166668 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3e5225-fb93-48c0-ab21-93c15a605088" path="/var/lib/kubelet/pods/ae3e5225-fb93-48c0-ab21-93c15a605088/volumes"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.167601 4971 scope.go:117] "RemoveContainer" containerID="2a6dda8fdf759fe3ad9f95692ab3bfba56c2aaf30fb3b99e79c07b2f71f9b5d0"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.582725 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"]
Mar 09 09:57:29 crc kubenswrapper[4971]: E0309 09:57:29.583257 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3e5225-fb93-48c0-ab21-93c15a605088" containerName="swift-ring-rebalance"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.583269 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3e5225-fb93-48c0-ab21-93c15a605088" containerName="swift-ring-rebalance"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.583444 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3e5225-fb93-48c0-ab21-93c15a605088" containerName="swift-ring-rebalance"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.583903 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.587039 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.587230 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.592530 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"]
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.737935 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.737996 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.738023 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.738107 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtjd\" (UniqueName: \"kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.738160 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.738369 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.839584 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qtjd\" (UniqueName: \"kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.840010 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.840777 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.840848 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.840912 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.840958 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.841175 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.841576 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.844525 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.844576 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.844809 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.863894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qtjd\" (UniqueName: \"kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd\") pod \"swift-ring-rebalance-debug-6rls7\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:29 crc kubenswrapper[4971]: I0309 09:57:29.900659 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:30 crc kubenswrapper[4971]: I0309 09:57:30.124694 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"]
Mar 09 09:57:30 crc kubenswrapper[4971]: I0309 09:57:30.162053 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7" event={"ID":"af11ecac-a96b-42d7-ae17-de8956293e71","Type":"ContainerStarted","Data":"653185a9ca527d05f3b9819400c1f5c303d7cac98395e5275afb2ca844684e3a"}
Mar 09 09:57:31 crc kubenswrapper[4971]: I0309 09:57:31.174126 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7" event={"ID":"af11ecac-a96b-42d7-ae17-de8956293e71","Type":"ContainerStarted","Data":"fdc681a59d04d51318a102f0fd0e3dd9f580263ce86d9ae72d11b8f85b2eb334"}
Mar 09 09:57:31 crc kubenswrapper[4971]: I0309 09:57:31.196903 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7" podStartSLOduration=2.196886476 podStartE2EDuration="2.196886476s" podCreationTimestamp="2026-03-09 09:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:31.195323982 +0000 UTC m=+2254.755251802" watchObservedRunningTime="2026-03-09 09:57:31.196886476 +0000 UTC m=+2254.756814286"
Mar 09 09:57:32 crc kubenswrapper[4971]: I0309 09:57:32.182850 4971 generic.go:334] "Generic (PLEG): container finished" podID="af11ecac-a96b-42d7-ae17-de8956293e71" containerID="fdc681a59d04d51318a102f0fd0e3dd9f580263ce86d9ae72d11b8f85b2eb334" exitCode=0
Mar 09 09:57:32 crc kubenswrapper[4971]: I0309 09:57:32.182892 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7" event={"ID":"af11ecac-a96b-42d7-ae17-de8956293e71","Type":"ContainerDied","Data":"fdc681a59d04d51318a102f0fd0e3dd9f580263ce86d9ae72d11b8f85b2eb334"}
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.493484 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.526001 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"]
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.533239 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"]
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606720 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606839 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qtjd\" (UniqueName: \"kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606907 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606938 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606960 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.606998 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts\") pod \"af11ecac-a96b-42d7-ae17-de8956293e71\" (UID: \"af11ecac-a96b-42d7-ae17-de8956293e71\") "
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.607918 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.608240 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.612516 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd" (OuterVolumeSpecName: "kube-api-access-2qtjd") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "kube-api-access-2qtjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.630096 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts" (OuterVolumeSpecName: "scripts") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.630385 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.639800 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "af11ecac-a96b-42d7-ae17-de8956293e71" (UID: "af11ecac-a96b-42d7-ae17-de8956293e71"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708814 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708848 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708862 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af11ecac-a96b-42d7-ae17-de8956293e71-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708871 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af11ecac-a96b-42d7-ae17-de8956293e71-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708879 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af11ecac-a96b-42d7-ae17-de8956293e71-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:33 crc kubenswrapper[4971]: I0309 09:57:33.708888 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qtjd\" (UniqueName: \"kubernetes.io/projected/af11ecac-a96b-42d7-ae17-de8956293e71-kube-api-access-2qtjd\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.203476 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="653185a9ca527d05f3b9819400c1f5c303d7cac98395e5275afb2ca844684e3a"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.203566 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6rls7"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.669642 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt665"]
Mar 09 09:57:34 crc kubenswrapper[4971]: E0309 09:57:34.670951 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af11ecac-a96b-42d7-ae17-de8956293e71" containerName="swift-ring-rebalance"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.671057 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af11ecac-a96b-42d7-ae17-de8956293e71" containerName="swift-ring-rebalance"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.671241 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="af11ecac-a96b-42d7-ae17-de8956293e71" containerName="swift-ring-rebalance"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.671820 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.675565 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.676047 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.687055 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt665"]
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824202 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824253 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824272 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824450 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f659n\" (UniqueName: \"kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824490 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.824570 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926332 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926400 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926473 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f659n\" (UniqueName: \"kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926497 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.926604 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.927297 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.927311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.927335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.930639 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.936213 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:34 crc kubenswrapper[4971]: I0309 09:57:34.943860 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f659n\" (UniqueName: \"kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n\") pod \"swift-ring-rebalance-debug-xt665\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:35 crc kubenswrapper[4971]: I0309 09:57:35.012853 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:35 crc kubenswrapper[4971]: I0309 09:57:35.160309 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af11ecac-a96b-42d7-ae17-de8956293e71" path="/var/lib/kubelet/pods/af11ecac-a96b-42d7-ae17-de8956293e71/volumes"
Mar 09 09:57:35 crc kubenswrapper[4971]: I0309 09:57:35.411997 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt665"]
Mar 09 09:57:35 crc kubenswrapper[4971]: W0309 09:57:35.416800 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e629d8_d2bf_4c02_8936_2d70a37f5d1a.slice/crio-ea83c849c2352ef8a8084e27a6f7b7cd7d90cfdf867326e20cf62ebed28eb0e9 WatchSource:0}: Error finding container ea83c849c2352ef8a8084e27a6f7b7cd7d90cfdf867326e20cf62ebed28eb0e9: Status 404 returned error can't find the container with id ea83c849c2352ef8a8084e27a6f7b7cd7d90cfdf867326e20cf62ebed28eb0e9
Mar 09 09:57:36 crc kubenswrapper[4971]: I0309 09:57:36.226596 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665" event={"ID":"06e629d8-d2bf-4c02-8936-2d70a37f5d1a","Type":"ContainerStarted","Data":"d9ea79ea67c439c08bf4a6bdb114f43bf94f600024ce53a496217694f5fa5a69"}
Mar 09 09:57:36 crc kubenswrapper[4971]: I0309 09:57:36.226990 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665" event={"ID":"06e629d8-d2bf-4c02-8936-2d70a37f5d1a","Type":"ContainerStarted","Data":"ea83c849c2352ef8a8084e27a6f7b7cd7d90cfdf867326e20cf62ebed28eb0e9"}
Mar 09 09:57:36 crc kubenswrapper[4971]: I0309 09:57:36.244655 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665" podStartSLOduration=2.244627065 podStartE2EDuration="2.244627065s" podCreationTimestamp="2026-03-09 09:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:36.24053512 +0000 UTC m=+2259.800462930" watchObservedRunningTime="2026-03-09 09:57:36.244627065 +0000 UTC m=+2259.804554875"
Mar 09 09:57:37 crc kubenswrapper[4971]: I0309 09:57:37.236840 4971 generic.go:334] "Generic (PLEG): container finished" podID="06e629d8-d2bf-4c02-8936-2d70a37f5d1a" containerID="d9ea79ea67c439c08bf4a6bdb114f43bf94f600024ce53a496217694f5fa5a69" exitCode=0
Mar 09 09:57:37 crc kubenswrapper[4971]: I0309 09:57:37.236885 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665" event={"ID":"06e629d8-d2bf-4c02-8936-2d70a37f5d1a","Type":"ContainerDied","Data":"d9ea79ea67c439c08bf4a6bdb114f43bf94f600024ce53a496217694f5fa5a69"}
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.546272 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665"
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.584827 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt665"]
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.594471 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xt665"]
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.597456 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.597675 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.597799 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.598011 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.598742 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f659n\" (UniqueName: \"kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.598789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices\") pod \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\" (UID: \"06e629d8-d2bf-4c02-8936-2d70a37f5d1a\") "
Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.599111 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). InnerVolumeSpecName "etc-swift".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.599342 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.600057 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.604876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n" (OuterVolumeSpecName: "kube-api-access-f659n") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). InnerVolumeSpecName "kube-api-access-f659n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.620239 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.630569 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.638274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts" (OuterVolumeSpecName: "scripts") pod "06e629d8-d2bf-4c02-8936-2d70a37f5d1a" (UID: "06e629d8-d2bf-4c02-8936-2d70a37f5d1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.700533 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.700565 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f659n\" (UniqueName: \"kubernetes.io/projected/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-kube-api-access-f659n\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.700576 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.700588 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:38 crc kubenswrapper[4971]: I0309 09:57:38.700596 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06e629d8-d2bf-4c02-8936-2d70a37f5d1a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.163021 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="06e629d8-d2bf-4c02-8936-2d70a37f5d1a" path="/var/lib/kubelet/pods/06e629d8-d2bf-4c02-8936-2d70a37f5d1a/volumes" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.254584 4971 scope.go:117] "RemoveContainer" containerID="d9ea79ea67c439c08bf4a6bdb114f43bf94f600024ce53a496217694f5fa5a69" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.254773 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xt665" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.713055 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-747k9"] Mar 09 09:57:39 crc kubenswrapper[4971]: E0309 09:57:39.713438 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e629d8-d2bf-4c02-8936-2d70a37f5d1a" containerName="swift-ring-rebalance" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.713455 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e629d8-d2bf-4c02-8936-2d70a37f5d1a" containerName="swift-ring-rebalance" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.713662 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e629d8-d2bf-4c02-8936-2d70a37f5d1a" containerName="swift-ring-rebalance" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.714219 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.716878 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.716878 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.727428 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-747k9"] Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.816597 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.816658 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcmwj\" (UniqueName: \"kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.816743 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.816777 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.816829 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.817220 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.918624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.918692 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcmwj\" (UniqueName: \"kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc 
kubenswrapper[4971]: I0309 09:57:39.918729 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.918754 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.918783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.918871 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.919599 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: 
I0309 09:57:39.919738 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.919768 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.922628 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.924117 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:39 crc kubenswrapper[4971]: I0309 09:57:39.935086 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcmwj\" (UniqueName: \"kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj\") pod \"swift-ring-rebalance-debug-747k9\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:40 crc kubenswrapper[4971]: I0309 09:57:40.031933 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:40 crc kubenswrapper[4971]: I0309 09:57:40.947799 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-747k9"] Mar 09 09:57:41 crc kubenswrapper[4971]: I0309 09:57:41.276715 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" event={"ID":"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13","Type":"ContainerStarted","Data":"b251cdd066c16b74e6e0498f7266e380f265dc02219216b18c45abf57c136632"} Mar 09 09:57:41 crc kubenswrapper[4971]: I0309 09:57:41.277166 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" event={"ID":"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13","Type":"ContainerStarted","Data":"aa658981c058f74cdf14904de66c4a208697928d3158c1faaf7180c4fbae7a2e"} Mar 09 09:57:41 crc kubenswrapper[4971]: I0309 09:57:41.303117 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" podStartSLOduration=2.303100696 podStartE2EDuration="2.303100696s" podCreationTimestamp="2026-03-09 09:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:41.294601093 +0000 UTC m=+2264.854528963" watchObservedRunningTime="2026-03-09 09:57:41.303100696 +0000 UTC m=+2264.863028506" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.054096 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"] Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.057932 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.067597 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"] Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.071849 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwtv\" (UniqueName: \"kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.071891 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.071946 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.173728 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.173942 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gvwtv\" (UniqueName: \"kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.173972 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.174231 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.174609 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.196496 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwtv\" (UniqueName: \"kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv\") pod \"certified-operators-4kzd2\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.303738 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" containerID="b251cdd066c16b74e6e0498f7266e380f265dc02219216b18c45abf57c136632" exitCode=0 Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.303788 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" event={"ID":"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13","Type":"ContainerDied","Data":"b251cdd066c16b74e6e0498f7266e380f265dc02219216b18c45abf57c136632"} Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.391922 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:43 crc kubenswrapper[4971]: I0309 09:57:43.865379 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"] Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.312227 4971 generic.go:334] "Generic (PLEG): container finished" podID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerID="a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809" exitCode=0 Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.312293 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerDied","Data":"a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809"} Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.312626 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerStarted","Data":"c1a547f4741961abc6c7ca37465c449e3627819b67d2d9af844e78c734f8b46d"} Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.589996 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.601851 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.601959 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.602015 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.602095 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcmwj\" (UniqueName: \"kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.602160 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.602188 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices\") pod \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\" (UID: \"77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13\") " Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.603093 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.603114 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.611068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj" (OuterVolumeSpecName: "kube-api-access-lcmwj") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "kube-api-access-lcmwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.631512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.633092 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.635652 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts" (OuterVolumeSpecName: "scripts") pod "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" (UID: "77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.635820 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-747k9"] Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.644290 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-747k9"] Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704148 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704209 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704230 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-etc-swift\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704251 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcmwj\" (UniqueName: \"kubernetes.io/projected/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-kube-api-access-lcmwj\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704269 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.704285 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.794499 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:57:44 crc kubenswrapper[4971]: I0309 09:57:44.794611 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.160243 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" path="/var/lib/kubelet/pods/77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13/volumes" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.324565 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-747k9" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.324597 4971 scope.go:117] "RemoveContainer" containerID="b251cdd066c16b74e6e0498f7266e380f265dc02219216b18c45abf57c136632" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.327012 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerStarted","Data":"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"} Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.441933 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvccr"] Mar 09 09:57:45 crc kubenswrapper[4971]: E0309 09:57:45.442333 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" containerName="swift-ring-rebalance" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.442369 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" containerName="swift-ring-rebalance" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.442550 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="77024fcc-8d3b-4e4e-ba7f-eb9b177b9c13" containerName="swift-ring-rebalance" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.443684 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.465936 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccr"] Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.516989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n5xq\" (UniqueName: \"kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.517276 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.517456 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.619423 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n5xq\" (UniqueName: \"kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.620043 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.620200 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.620680 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.620788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.636286 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n5xq\" (UniqueName: \"kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq\") pod \"community-operators-hvccr\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") " pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.762194 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.763899 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tww8p"] Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.764764 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.767778 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.768035 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.774999 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tww8p"] Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824648 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824717 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824772 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824834 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824871 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.824896 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jhk\" (UniqueName: \"kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926235 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926280 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926309 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jhk\" (UniqueName: \"kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.926436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 
09:57:45.927812 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.928525 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.928573 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.932166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.960340 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jhk\" (UniqueName: \"kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:45 crc kubenswrapper[4971]: I0309 09:57:45.967834 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf\") pod \"swift-ring-rebalance-debug-tww8p\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.079912 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.264590 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccr"] Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.337294 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerStarted","Data":"55ae37edc3b1f6ecc29d2d042a739bbae20cfaa599a8c048224c581854b473bc"} Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.339679 4971 generic.go:334] "Generic (PLEG): container finished" podID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerID="db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b" exitCode=0 Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.339708 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerDied","Data":"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"} Mar 09 09:57:46 crc kubenswrapper[4971]: I0309 09:57:46.461518 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tww8p"] Mar 09 09:57:46 crc kubenswrapper[4971]: W0309 09:57:46.505163 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce5af283_d357_4474_a53d_ed689e79f193.slice/crio-f815e819c50e1514845ef80379bf148306a3c68a552ae41898f3856d472078e5 WatchSource:0}: Error finding container f815e819c50e1514845ef80379bf148306a3c68a552ae41898f3856d472078e5: Status 404 returned error can't find the container with id f815e819c50e1514845ef80379bf148306a3c68a552ae41898f3856d472078e5 Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.355836 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerStarted","Data":"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"} Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.359497 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" event={"ID":"ce5af283-d357-4474-a53d-ed689e79f193","Type":"ContainerStarted","Data":"da66792949c32da0ae0b5463bf18a328622da4220e5b44b4e0bfd9159b9a726f"} Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.359560 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" event={"ID":"ce5af283-d357-4474-a53d-ed689e79f193","Type":"ContainerStarted","Data":"f815e819c50e1514845ef80379bf148306a3c68a552ae41898f3856d472078e5"} Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.361684 4971 generic.go:334] "Generic (PLEG): container finished" podID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerID="625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069" exitCode=0 Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.361724 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerDied","Data":"625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069"} Mar 09 09:57:47 crc 
kubenswrapper[4971]: I0309 09:57:47.377709 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4kzd2" podStartSLOduration=1.958892147 podStartE2EDuration="4.37768801s" podCreationTimestamp="2026-03-09 09:57:43 +0000 UTC" firstStartedPulling="2026-03-09 09:57:44.316531273 +0000 UTC m=+2267.876459103" lastFinishedPulling="2026-03-09 09:57:46.735327146 +0000 UTC m=+2270.295254966" observedRunningTime="2026-03-09 09:57:47.375795568 +0000 UTC m=+2270.935723388" watchObservedRunningTime="2026-03-09 09:57:47.37768801 +0000 UTC m=+2270.937615830" Mar 09 09:57:47 crc kubenswrapper[4971]: I0309 09:57:47.404770 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" podStartSLOduration=2.404744634 podStartE2EDuration="2.404744634s" podCreationTimestamp="2026-03-09 09:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:47.39769088 +0000 UTC m=+2270.957618690" watchObservedRunningTime="2026-03-09 09:57:47.404744634 +0000 UTC m=+2270.964672444" Mar 09 09:57:48 crc kubenswrapper[4971]: I0309 09:57:48.373689 4971 generic.go:334] "Generic (PLEG): container finished" podID="ce5af283-d357-4474-a53d-ed689e79f193" containerID="da66792949c32da0ae0b5463bf18a328622da4220e5b44b4e0bfd9159b9a726f" exitCode=0 Mar 09 09:57:48 crc kubenswrapper[4971]: I0309 09:57:48.373895 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" event={"ID":"ce5af283-d357-4474-a53d-ed689e79f193","Type":"ContainerDied","Data":"da66792949c32da0ae0b5463bf18a328622da4220e5b44b4e0bfd9159b9a726f"} Mar 09 09:57:48 crc kubenswrapper[4971]: I0309 09:57:48.376667 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" 
event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerStarted","Data":"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"} Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.390633 4971 generic.go:334] "Generic (PLEG): container finished" podID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerID="beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe" exitCode=0 Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.390711 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerDied","Data":"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"} Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.652456 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.680661 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tww8p"] Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.686862 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tww8p"] Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.779082 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.779286 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc 
kubenswrapper[4971]: I0309 09:57:49.779312 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.779338 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jhk\" (UniqueName: \"kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.779383 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.779403 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices\") pod \"ce5af283-d357-4474-a53d-ed689e79f193\" (UID: \"ce5af283-d357-4474-a53d-ed689e79f193\") " Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.780190 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.780260 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.784728 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk" (OuterVolumeSpecName: "kube-api-access-n5jhk") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "kube-api-access-n5jhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.804500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.806235 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts" (OuterVolumeSpecName: "scripts") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.809114 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ce5af283-d357-4474-a53d-ed689e79f193" (UID: "ce5af283-d357-4474-a53d-ed689e79f193"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880902 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880934 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880945 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ce5af283-d357-4474-a53d-ed689e79f193-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880957 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jhk\" (UniqueName: \"kubernetes.io/projected/ce5af283-d357-4474-a53d-ed689e79f193-kube-api-access-n5jhk\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880966 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ce5af283-d357-4474-a53d-ed689e79f193-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:49 crc kubenswrapper[4971]: I0309 09:57:49.880974 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/ce5af283-d357-4474-a53d-ed689e79f193-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.400910 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f815e819c50e1514845ef80379bf148306a3c68a552ae41898f3856d472078e5" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.400991 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tww8p" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.404134 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerStarted","Data":"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"} Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.816664 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvccr" podStartSLOduration=3.375281693 podStartE2EDuration="5.816635935s" podCreationTimestamp="2026-03-09 09:57:45 +0000 UTC" firstStartedPulling="2026-03-09 09:57:47.365063844 +0000 UTC m=+2270.924991664" lastFinishedPulling="2026-03-09 09:57:49.806418096 +0000 UTC m=+2273.366345906" observedRunningTime="2026-03-09 09:57:50.444691589 +0000 UTC m=+2274.004619399" watchObservedRunningTime="2026-03-09 09:57:50.816635935 +0000 UTC m=+2274.376563785" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.826045 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b"] Mar 09 09:57:50 crc kubenswrapper[4971]: E0309 09:57:50.827286 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5af283-d357-4474-a53d-ed689e79f193" containerName="swift-ring-rebalance" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.827315 4971 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ce5af283-d357-4474-a53d-ed689e79f193" containerName="swift-ring-rebalance" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.827556 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5af283-d357-4474-a53d-ed689e79f193" containerName="swift-ring-rebalance" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.828392 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.831841 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.832306 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:50 crc kubenswrapper[4971]: I0309 09:57:50.838631 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b"] Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.004311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.004369 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.004391 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.004421 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.004926 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g927l\" (UniqueName: \"kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.005089 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106189 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106291 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106315 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106330 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106375 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.106396 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g927l\" (UniqueName: \"kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 
09:57:51.106977 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.107119 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.107165 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.110879 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.111773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.123711 4971 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g927l\" (UniqueName: \"kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l\") pod \"swift-ring-rebalance-debug-7jb6b\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.154806 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.163169 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5af283-d357-4474-a53d-ed689e79f193" path="/var/lib/kubelet/pods/ce5af283-d357-4474-a53d-ed689e79f193/volumes" Mar 09 09:57:51 crc kubenswrapper[4971]: I0309 09:57:51.556258 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b"] Mar 09 09:57:51 crc kubenswrapper[4971]: W0309 09:57:51.566066 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e6e879_0688_459a_b5d7_08b99babc17f.slice/crio-ba70d7177025cb727d8d2058166ef4f5e61b15db4ec390aac4b753a71c60e37e WatchSource:0}: Error finding container ba70d7177025cb727d8d2058166ef4f5e61b15db4ec390aac4b753a71c60e37e: Status 404 returned error can't find the container with id ba70d7177025cb727d8d2058166ef4f5e61b15db4ec390aac4b753a71c60e37e Mar 09 09:57:52 crc kubenswrapper[4971]: I0309 09:57:52.422035 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" event={"ID":"c5e6e879-0688-459a-b5d7-08b99babc17f","Type":"ContainerStarted","Data":"0f44ccfd005ed8824c50e6d80b6c30a690c189c4076a93f772b88fa545dbc9b3"} Mar 09 09:57:52 crc kubenswrapper[4971]: I0309 09:57:52.422402 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" 
event={"ID":"c5e6e879-0688-459a-b5d7-08b99babc17f","Type":"ContainerStarted","Data":"ba70d7177025cb727d8d2058166ef4f5e61b15db4ec390aac4b753a71c60e37e"} Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.392629 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.392961 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.435186 4971 generic.go:334] "Generic (PLEG): container finished" podID="c5e6e879-0688-459a-b5d7-08b99babc17f" containerID="0f44ccfd005ed8824c50e6d80b6c30a690c189c4076a93f772b88fa545dbc9b3" exitCode=0 Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.435241 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" event={"ID":"c5e6e879-0688-459a-b5d7-08b99babc17f","Type":"ContainerDied","Data":"0f44ccfd005ed8824c50e6d80b6c30a690c189c4076a93f772b88fa545dbc9b3"} Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.442438 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.492955 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:53 crc kubenswrapper[4971]: I0309 09:57:53.834745 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"] Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.699168 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.731597 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b"] Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.737961 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b"] Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857313 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857372 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g927l\" (UniqueName: \"kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857467 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.857516 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift\") pod \"c5e6e879-0688-459a-b5d7-08b99babc17f\" (UID: \"c5e6e879-0688-459a-b5d7-08b99babc17f\") " Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.858309 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.858919 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.864519 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l" (OuterVolumeSpecName: "kube-api-access-g927l") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "kube-api-access-g927l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.878022 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.880011 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.893264 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts" (OuterVolumeSpecName: "scripts") pod "c5e6e879-0688-459a-b5d7-08b99babc17f" (UID: "c5e6e879-0688-459a-b5d7-08b99babc17f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960124 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5e6e879-0688-459a-b5d7-08b99babc17f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960167 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960183 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e6e879-0688-459a-b5d7-08b99babc17f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960195 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960207 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g927l\" (UniqueName: \"kubernetes.io/projected/c5e6e879-0688-459a-b5d7-08b99babc17f-kube-api-access-g927l\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:54 crc kubenswrapper[4971]: I0309 09:57:54.960219 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5e6e879-0688-459a-b5d7-08b99babc17f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.161089 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5e6e879-0688-459a-b5d7-08b99babc17f" path="/var/lib/kubelet/pods/c5e6e879-0688-459a-b5d7-08b99babc17f/volumes" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.454140 4971 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7jb6b" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.454182 4971 scope.go:117] "RemoveContainer" containerID="0f44ccfd005ed8824c50e6d80b6c30a690c189c4076a93f772b88fa545dbc9b3" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.454293 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4kzd2" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="registry-server" containerID="cri-o://a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721" gracePeriod=2 Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.763030 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.763386 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.815772 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvccr" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.835190 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4kzd2" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892236 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"] Mar 09 09:57:55 crc kubenswrapper[4971]: E0309 09:57:55.892528 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e6e879-0688-459a-b5d7-08b99babc17f" containerName="swift-ring-rebalance" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892540 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e6e879-0688-459a-b5d7-08b99babc17f" containerName="swift-ring-rebalance" Mar 09 09:57:55 crc kubenswrapper[4971]: E0309 09:57:55.892560 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="registry-server" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892569 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="registry-server" Mar 09 09:57:55 crc kubenswrapper[4971]: E0309 09:57:55.892583 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="extract-content" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892589 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="extract-content" Mar 09 09:57:55 crc kubenswrapper[4971]: E0309 09:57:55.892605 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="extract-utilities" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892610 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="extract-utilities" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892798 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c5e6e879-0688-459a-b5d7-08b99babc17f" containerName="swift-ring-rebalance" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.892820 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerName="registry-server" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.893420 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.895453 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.895573 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.903337 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"] Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.976058 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content\") pod \"405aed0a-ce2a-4fa1-b482-6086aff71975\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.976217 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities\") pod \"405aed0a-ce2a-4fa1-b482-6086aff71975\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.976283 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvwtv\" (UniqueName: \"kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv\") pod 
\"405aed0a-ce2a-4fa1-b482-6086aff71975\" (UID: \"405aed0a-ce2a-4fa1-b482-6086aff71975\") " Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.977001 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities" (OuterVolumeSpecName: "utilities") pod "405aed0a-ce2a-4fa1-b482-6086aff71975" (UID: "405aed0a-ce2a-4fa1-b482-6086aff71975"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:55 crc kubenswrapper[4971]: I0309 09:57:55.982326 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv" (OuterVolumeSpecName: "kube-api-access-gvwtv") pod "405aed0a-ce2a-4fa1-b482-6086aff71975" (UID: "405aed0a-ce2a-4fa1-b482-6086aff71975"). InnerVolumeSpecName "kube-api-access-gvwtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.034789 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "405aed0a-ce2a-4fa1-b482-6086aff71975" (UID: "405aed0a-ce2a-4fa1-b482-6086aff71975"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078414 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078470 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078517 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078537 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhq8d\" (UniqueName: \"kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078835 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.078958 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.079051 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.079071 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/405aed0a-ce2a-4fa1-b482-6086aff71975-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.079085 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvwtv\" (UniqueName: \"kubernetes.io/projected/405aed0a-ce2a-4fa1-b482-6086aff71975-kube-api-access-gvwtv\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180432 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180493 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180536 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhq8d\" (UniqueName: \"kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180598 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.180635 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.181721 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.181854 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.181977 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.183820 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.184935 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.197818 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhq8d\" (UniqueName: 
\"kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d\") pod \"swift-ring-rebalance-debug-sdwcn\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.219676 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.464639 4971 generic.go:334] "Generic (PLEG): container finished" podID="405aed0a-ce2a-4fa1-b482-6086aff71975" containerID="a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721" exitCode=0
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.464705 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kzd2"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.464736 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerDied","Data":"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"}
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.465157 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kzd2" event={"ID":"405aed0a-ce2a-4fa1-b482-6086aff71975","Type":"ContainerDied","Data":"c1a547f4741961abc6c7ca37465c449e3627819b67d2d9af844e78c734f8b46d"}
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.465185 4971 scope.go:117] "RemoveContainer" containerID="a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.489020 4971 scope.go:117] "RemoveContainer" containerID="db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.506875 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"]
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.513695 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4kzd2"]
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.530177 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvccr"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.535415 4971 scope.go:117] "RemoveContainer" containerID="a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.555595 4971 scope.go:117] "RemoveContainer" containerID="a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"
Mar 09 09:57:56 crc kubenswrapper[4971]: E0309 09:57:56.556068 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721\": container with ID starting with a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721 not found: ID does not exist" containerID="a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.556117 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721"} err="failed to get container status \"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721\": rpc error: code = NotFound desc = could not find container \"a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721\": container with ID starting with a2670868da1ce5ab17622635a2dd18461fa4ddb53013957dc362f89e4aca6721 not found: ID does not exist"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.556147 4971 scope.go:117] "RemoveContainer" containerID="db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"
Mar 09 09:57:56 crc kubenswrapper[4971]: E0309 09:57:56.556452 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b\": container with ID starting with db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b not found: ID does not exist" containerID="db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.556483 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b"} err="failed to get container status \"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b\": rpc error: code = NotFound desc = could not find container \"db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b\": container with ID starting with db448860d2406381349bdaccd05e046567ce0d3b3b5c1f34b271d1c944e0360b not found: ID does not exist"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.556499 4971 scope.go:117] "RemoveContainer" containerID="a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809"
Mar 09 09:57:56 crc kubenswrapper[4971]: E0309 09:57:56.556850 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809\": container with ID starting with a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809 not found: ID does not exist" containerID="a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.556874 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809"} err="failed to get container status \"a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809\": rpc error: code = NotFound desc = could not find container \"a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809\": container with ID starting with a34529f5c1cdbed6f1c229069134f20dd2729c8ba57e925b646b22098bd97809 not found: ID does not exist"
Mar 09 09:57:56 crc kubenswrapper[4971]: I0309 09:57:56.625465 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"]
Mar 09 09:57:57 crc kubenswrapper[4971]: I0309 09:57:57.164871 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405aed0a-ce2a-4fa1-b482-6086aff71975" path="/var/lib/kubelet/pods/405aed0a-ce2a-4fa1-b482-6086aff71975/volumes"
Mar 09 09:57:57 crc kubenswrapper[4971]: I0309 09:57:57.477693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" event={"ID":"b48023f8-833f-424d-bcde-d56a8729fe05","Type":"ContainerStarted","Data":"cda968c437520562c4f15807e00fcb0df9be2b2ec794240bacac803a73af827a"}
Mar 09 09:57:57 crc kubenswrapper[4971]: I0309 09:57:57.477735 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" event={"ID":"b48023f8-833f-424d-bcde-d56a8729fe05","Type":"ContainerStarted","Data":"815ece0434df72b69f4506523ac7a2cdc3f8efd018ab76d77d9781a37c31bd1b"}
Mar 09 09:57:57 crc kubenswrapper[4971]: I0309 09:57:57.502010 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" podStartSLOduration=2.501991877 podStartE2EDuration="2.501991877s" podCreationTimestamp="2026-03-09 09:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:57:57.500573518 +0000 UTC m=+2281.060501338" watchObservedRunningTime="2026-03-09 09:57:57.501991877 +0000 UTC m=+2281.061919697"
Mar 09 09:57:58 crc kubenswrapper[4971]: I0309 09:57:58.232688 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccr"]
Mar 09 09:57:58 crc kubenswrapper[4971]: I0309 09:57:58.489233 4971 generic.go:334] "Generic (PLEG): container finished" podID="b48023f8-833f-424d-bcde-d56a8729fe05" containerID="cda968c437520562c4f15807e00fcb0df9be2b2ec794240bacac803a73af827a" exitCode=0
Mar 09 09:57:58 crc kubenswrapper[4971]: I0309 09:57:58.489295 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn" event={"ID":"b48023f8-833f-424d-bcde-d56a8729fe05","Type":"ContainerDied","Data":"cda968c437520562c4f15807e00fcb0df9be2b2ec794240bacac803a73af827a"}
Mar 09 09:57:58 crc kubenswrapper[4971]: I0309 09:57:58.489459 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvccr" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="registry-server" containerID="cri-o://a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052" gracePeriod=2
Mar 09 09:57:58 crc kubenswrapper[4971]: I0309 09:57:58.908436 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccr"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.022566 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities\") pod \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.022688 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content\") pod \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.022775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n5xq\" (UniqueName: \"kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq\") pod \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\" (UID: \"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.023512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities" (OuterVolumeSpecName: "utilities") pod "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" (UID: "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.027891 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq" (OuterVolumeSpecName: "kube-api-access-7n5xq") pod "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" (UID: "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5"). InnerVolumeSpecName "kube-api-access-7n5xq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.072196 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" (UID: "3d728e05-9ee8-42f5-a7d3-0a4418f45ab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.124427 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.124470 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n5xq\" (UniqueName: \"kubernetes.io/projected/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-kube-api-access-7n5xq\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.124487 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.505902 4971 generic.go:334] "Generic (PLEG): container finished" podID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerID="a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052" exitCode=0
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.505983 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccr"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.505982 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerDied","Data":"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"}
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.506050 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccr" event={"ID":"3d728e05-9ee8-42f5-a7d3-0a4418f45ab5","Type":"ContainerDied","Data":"55ae37edc3b1f6ecc29d2d042a739bbae20cfaa599a8c048224c581854b473bc"}
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.506078 4971 scope.go:117] "RemoveContainer" containerID="a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.532233 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccr"]
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.537877 4971 scope.go:117] "RemoveContainer" containerID="beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.537977 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvccr"]
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.559898 4971 scope.go:117] "RemoveContainer" containerID="625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.596726 4971 scope.go:117] "RemoveContainer" containerID="a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"
Mar 09 09:57:59 crc kubenswrapper[4971]: E0309 09:57:59.602677 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052\": container with ID starting with a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052 not found: ID does not exist" containerID="a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.602733 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052"} err="failed to get container status \"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052\": rpc error: code = NotFound desc = could not find container \"a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052\": container with ID starting with a58c4adbce9e75559dca6a7f80f11a3f6ab759878b4ddccb10dd02885efa0052 not found: ID does not exist"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.602756 4971 scope.go:117] "RemoveContainer" containerID="beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"
Mar 09 09:57:59 crc kubenswrapper[4971]: E0309 09:57:59.603069 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe\": container with ID starting with beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe not found: ID does not exist" containerID="beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.603085 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe"} err="failed to get container status \"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe\": rpc error: code = NotFound desc = could not find container \"beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe\": container with ID starting with beae46edb7bfc14dc130fd9c2dffe8d0822de075981fe5cfe512bb1af7fbf2fe not found: ID does not exist"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.603098 4971 scope.go:117] "RemoveContainer" containerID="625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069"
Mar 09 09:57:59 crc kubenswrapper[4971]: E0309 09:57:59.603385 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069\": container with ID starting with 625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069 not found: ID does not exist" containerID="625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.603409 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069"} err="failed to get container status \"625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069\": rpc error: code = NotFound desc = could not find container \"625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069\": container with ID starting with 625076e3ca348ce891c0615ceb8f24fcb6f4d28065c78ca50761c26ed7ce2069 not found: ID does not exist"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.800485 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.834646 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"]
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.841637 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"]
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935489 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935567 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhq8d\" (UniqueName: \"kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935606 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935651 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935672 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.935711 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices\") pod \"b48023f8-833f-424d-bcde-d56a8729fe05\" (UID: \"b48023f8-833f-424d-bcde-d56a8729fe05\") "
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.936288 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.936607 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.940165 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d" (OuterVolumeSpecName: "kube-api-access-vhq8d") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "kube-api-access-vhq8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.955973 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.957639 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:57:59 crc kubenswrapper[4971]: I0309 09:57:59.961626 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts" (OuterVolumeSpecName: "scripts") pod "b48023f8-833f-424d-bcde-d56a8729fe05" (UID: "b48023f8-833f-424d-bcde-d56a8729fe05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037753 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b48023f8-833f-424d-bcde-d56a8729fe05-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037804 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhq8d\" (UniqueName: \"kubernetes.io/projected/b48023f8-833f-424d-bcde-d56a8729fe05-kube-api-access-vhq8d\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037822 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037832 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037844 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b48023f8-833f-424d-bcde-d56a8729fe05-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.037856 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b48023f8-833f-424d-bcde-d56a8729fe05-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.135963 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550838-fvgvb"]
Mar 09 09:58:00 crc kubenswrapper[4971]: E0309 09:58:00.136251 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48023f8-833f-424d-bcde-d56a8729fe05" containerName="swift-ring-rebalance"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136264 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48023f8-833f-424d-bcde-d56a8729fe05" containerName="swift-ring-rebalance"
Mar 09 09:58:00 crc kubenswrapper[4971]: E0309 09:58:00.136277 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="extract-content"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136284 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="extract-content"
Mar 09 09:58:00 crc kubenswrapper[4971]: E0309 09:58:00.136297 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="extract-utilities"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136304 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="extract-utilities"
Mar 09 09:58:00 crc kubenswrapper[4971]: E0309 09:58:00.136330 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="registry-server"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136336 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="registry-server"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136488 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48023f8-833f-424d-bcde-d56a8729fe05" containerName="swift-ring-rebalance"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136510 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" containerName="registry-server"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.136950 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-fvgvb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.139200 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.139691 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.139992 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.145775 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-fvgvb"]
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.240656 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8j5\" (UniqueName: \"kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5\") pod \"auto-csr-approver-29550838-fvgvb\" (UID: \"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb\") " pod="openshift-infra/auto-csr-approver-29550838-fvgvb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.343361 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8j5\" (UniqueName: \"kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5\") pod \"auto-csr-approver-29550838-fvgvb\" (UID: \"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb\") " pod="openshift-infra/auto-csr-approver-29550838-fvgvb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.359904 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8j5\" (UniqueName: \"kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5\") pod \"auto-csr-approver-29550838-fvgvb\" (UID: \"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb\") " pod="openshift-infra/auto-csr-approver-29550838-fvgvb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.497628 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-fvgvb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.517783 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815ece0434df72b69f4506523ac7a2cdc3f8efd018ab76d77d9781a37c31bd1b"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.517810 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdwcn"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.910185 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-fvgvb"]
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.971819 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"]
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.973024 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.974604 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.975214 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:58:00 crc kubenswrapper[4971]: I0309 09:58:00.978439 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"]
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154397 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154445 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154528 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154572 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154590 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.154614 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xvz\" (UniqueName: \"kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.161737 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d728e05-9ee8-42f5-a7d3-0a4418f45ab5" path="/var/lib/kubelet/pods/3d728e05-9ee8-42f5-a7d3-0a4418f45ab5/volumes"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.162427 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48023f8-833f-424d-bcde-d56a8729fe05" path="/var/lib/kubelet/pods/b48023f8-833f-424d-bcde-d56a8729fe05/volumes"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.256130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.256207 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.256645 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.257053 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.257093 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.257255 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.257294 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.257344 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xvz\" (UniqueName: \"kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.258276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.264060 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.268821 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.276023 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xvz\" (UniqueName: \"kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz\") pod \"swift-ring-rebalance-debug-m96jb\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.291326 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.529542 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-fvgvb" event={"ID":"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb","Type":"ContainerStarted","Data":"6a31fa1f53d7bfc03f9785f6ef27a165da0ef270688f4223f6a9fb203a387700"}
Mar 09 09:58:01 crc kubenswrapper[4971]: I0309 09:58:01.712385 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"]
Mar 09 09:58:01 crc kubenswrapper[4971]: W0309 09:58:01.717764 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c9cdfe9_2fae_4a5f_88b8_db966eea1a4b.slice/crio-913e825360e1a38cb583970c0077357112579349d45d4a29a1707a60a98cd90e WatchSource:0}: Error finding container 913e825360e1a38cb583970c0077357112579349d45d4a29a1707a60a98cd90e: Status 404 returned error can't find the container with id 913e825360e1a38cb583970c0077357112579349d45d4a29a1707a60a98cd90e
Mar 09 09:58:02 crc kubenswrapper[4971]: I0309 09:58:02.543484 4971 generic.go:334] "Generic (PLEG): container finished" podID="4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" containerID="fb822c0e1473ee993de436875741d20de5a4cd0128860d42b74dbfddd7d7653a"
exitCode=0 Mar 09 09:58:02 crc kubenswrapper[4971]: I0309 09:58:02.543581 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-fvgvb" event={"ID":"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb","Type":"ContainerDied","Data":"fb822c0e1473ee993de436875741d20de5a4cd0128860d42b74dbfddd7d7653a"} Mar 09 09:58:02 crc kubenswrapper[4971]: I0309 09:58:02.545978 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" event={"ID":"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b","Type":"ContainerStarted","Data":"8d80c2d7b39503147e611cbcaac7ef42789eef8b89b10f760e421ac8ae1c2ef6"} Mar 09 09:58:02 crc kubenswrapper[4971]: I0309 09:58:02.546046 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" event={"ID":"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b","Type":"ContainerStarted","Data":"913e825360e1a38cb583970c0077357112579349d45d4a29a1707a60a98cd90e"} Mar 09 09:58:02 crc kubenswrapper[4971]: I0309 09:58:02.582830 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" podStartSLOduration=2.582813623 podStartE2EDuration="2.582813623s" podCreationTimestamp="2026-03-09 09:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:02.580234112 +0000 UTC m=+2286.140161932" watchObservedRunningTime="2026-03-09 09:58:02.582813623 +0000 UTC m=+2286.142741433" Mar 09 09:58:03 crc kubenswrapper[4971]: I0309 09:58:03.556182 4971 generic.go:334] "Generic (PLEG): container finished" podID="5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" containerID="8d80c2d7b39503147e611cbcaac7ef42789eef8b89b10f760e421ac8ae1c2ef6" exitCode=0 Mar 09 09:58:03 crc kubenswrapper[4971]: I0309 09:58:03.556238 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" 
event={"ID":"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b","Type":"ContainerDied","Data":"8d80c2d7b39503147e611cbcaac7ef42789eef8b89b10f760e421ac8ae1c2ef6"} Mar 09 09:58:03 crc kubenswrapper[4971]: I0309 09:58:03.843823 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-fvgvb" Mar 09 09:58:03 crc kubenswrapper[4971]: I0309 09:58:03.999400 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd8j5\" (UniqueName: \"kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5\") pod \"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb\" (UID: \"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb\") " Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.004452 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5" (OuterVolumeSpecName: "kube-api-access-kd8j5") pod "4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" (UID: "4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb"). InnerVolumeSpecName "kube-api-access-kd8j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.100958 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd8j5\" (UniqueName: \"kubernetes.io/projected/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb-kube-api-access-kd8j5\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.572078 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-fvgvb" event={"ID":"4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb","Type":"ContainerDied","Data":"6a31fa1f53d7bfc03f9785f6ef27a165da0ef270688f4223f6a9fb203a387700"} Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.572131 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a31fa1f53d7bfc03f9785f6ef27a165da0ef270688f4223f6a9fb203a387700" Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.572129 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-fvgvb" Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.851854 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.899342 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"] Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.905392 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m96jb"] Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.911256 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-7sp5k"] Mar 09 09:58:04 crc kubenswrapper[4971]: I0309 09:58:04.918324 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-7sp5k"] Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.011703 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012124 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012153 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012275 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xvz\" (UniqueName: \"kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012341 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift\") pod \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\" (UID: \"5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b\") " Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.012953 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.013479 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.016374 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz" (OuterVolumeSpecName: "kube-api-access-g7xvz") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "kube-api-access-g7xvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.033009 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts" (OuterVolumeSpecName: "scripts") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.035319 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.035899 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" (UID: "5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.113808 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.114880 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7xvz\" (UniqueName: \"kubernetes.io/projected/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-kube-api-access-g7xvz\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.114916 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.114929 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.114942 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.114957 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.161046 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" path="/var/lib/kubelet/pods/5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b/volumes" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.161813 4971 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f3cdaf5b-75af-469e-b233-f1cadc6f49a0" path="/var/lib/kubelet/pods/f3cdaf5b-75af-469e-b233-f1cadc6f49a0/volumes" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.588243 4971 scope.go:117] "RemoveContainer" containerID="8d80c2d7b39503147e611cbcaac7ef42789eef8b89b10f760e421ac8ae1c2ef6" Mar 09 09:58:05 crc kubenswrapper[4971]: I0309 09:58:05.588285 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m96jb" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.038596 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5"] Mar 09 09:58:06 crc kubenswrapper[4971]: E0309 09:58:06.038923 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" containerName="oc" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.038937 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" containerName="oc" Mar 09 09:58:06 crc kubenswrapper[4971]: E0309 09:58:06.038968 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" containerName="swift-ring-rebalance" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.038974 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" containerName="swift-ring-rebalance" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.039103 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9cdfe9-2fae-4a5f-88b8-db966eea1a4b" containerName="swift-ring-rebalance" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.039116 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" containerName="oc" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.039586 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.061197 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.062596 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.116410 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5"] Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.130831 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.130891 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.130914 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.130967 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.131002 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.131020 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cttjd\" (UniqueName: \"kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.181257 4971 scope.go:117] "RemoveContainer" containerID="91d160aedd3f8d47dff2491523b23067338b142b6c2ebb92d658924ee0e820c2" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.205194 4971 scope.go:117] "RemoveContainer" containerID="811d8a88eb980fa2f3bd981bce5a50e23c55cd765c9080394d71afcf54e7a9cf" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232296 4971 scope.go:117] "RemoveContainer" containerID="5c1e72fd942670f367d3ce5bd9f8f69f949fbcecb953d8a09b98b497b077550c" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232540 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: 
\"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232617 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232638 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cttjd\" (UniqueName: \"kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232681 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232730 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.232751 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift\") pod 
\"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.233598 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.233929 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.233943 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.238061 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.238123 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: 
\"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.254234 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cttjd\" (UniqueName: \"kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd\") pod \"swift-ring-rebalance-debug-lbnc5\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.318390 4971 scope.go:117] "RemoveContainer" containerID="85009dcc1fa7756739a14ad0cb0328f3f6018b3684fccf1ffb98460612ee0a23" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.356873 4971 scope.go:117] "RemoveContainer" containerID="6012da9cb7cab1a63e85164cac194e0a7304a8809c24b43d7dc38dfc8edd57fb" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.361595 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:06 crc kubenswrapper[4971]: I0309 09:58:06.795862 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5"] Mar 09 09:58:06 crc kubenswrapper[4971]: W0309 09:58:06.798743 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29f7222c_3096_4d22_91cd_6838ab25caca.slice/crio-d72f468fc0db5f8fa80c14856192de03aaa7435cdc3cc301ecf1779787742eb9 WatchSource:0}: Error finding container d72f468fc0db5f8fa80c14856192de03aaa7435cdc3cc301ecf1779787742eb9: Status 404 returned error can't find the container with id d72f468fc0db5f8fa80c14856192de03aaa7435cdc3cc301ecf1779787742eb9 Mar 09 09:58:07 crc kubenswrapper[4971]: I0309 09:58:07.610183 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" 
event={"ID":"29f7222c-3096-4d22-91cd-6838ab25caca","Type":"ContainerStarted","Data":"e395dac390e2310108846d7a781e8be05e9ac4e9554caadc8cf57fe60e56aa19"} Mar 09 09:58:07 crc kubenswrapper[4971]: I0309 09:58:07.610491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" event={"ID":"29f7222c-3096-4d22-91cd-6838ab25caca","Type":"ContainerStarted","Data":"d72f468fc0db5f8fa80c14856192de03aaa7435cdc3cc301ecf1779787742eb9"} Mar 09 09:58:07 crc kubenswrapper[4971]: I0309 09:58:07.638453 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" podStartSLOduration=1.6384283179999999 podStartE2EDuration="1.638428318s" podCreationTimestamp="2026-03-09 09:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:07.625852503 +0000 UTC m=+2291.185780373" watchObservedRunningTime="2026-03-09 09:58:07.638428318 +0000 UTC m=+2291.198356148" Mar 09 09:58:08 crc kubenswrapper[4971]: I0309 09:58:08.622570 4971 generic.go:334] "Generic (PLEG): container finished" podID="29f7222c-3096-4d22-91cd-6838ab25caca" containerID="e395dac390e2310108846d7a781e8be05e9ac4e9554caadc8cf57fe60e56aa19" exitCode=0 Mar 09 09:58:08 crc kubenswrapper[4971]: I0309 09:58:08.622644 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" event={"ID":"29f7222c-3096-4d22-91cd-6838ab25caca","Type":"ContainerDied","Data":"e395dac390e2310108846d7a781e8be05e9ac4e9554caadc8cf57fe60e56aa19"} Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.901988 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.944780 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5"] Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.956570 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5"] Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.989763 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.989855 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.989962 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.990036 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.990064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.990088 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cttjd\" (UniqueName: \"kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd\") pod \"29f7222c-3096-4d22-91cd-6838ab25caca\" (UID: \"29f7222c-3096-4d22-91cd-6838ab25caca\") " Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.990786 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.991328 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:09 crc kubenswrapper[4971]: I0309 09:58:09.995025 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd" (OuterVolumeSpecName: "kube-api-access-cttjd") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "kube-api-access-cttjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.010498 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.010775 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.012096 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts" (OuterVolumeSpecName: "scripts") pod "29f7222c-3096-4d22-91cd-6838ab25caca" (UID: "29f7222c-3096-4d22-91cd-6838ab25caca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091885 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29f7222c-3096-4d22-91cd-6838ab25caca-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091924 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091937 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091948 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29f7222c-3096-4d22-91cd-6838ab25caca-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091960 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29f7222c-3096-4d22-91cd-6838ab25caca-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.091971 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cttjd\" (UniqueName: \"kubernetes.io/projected/29f7222c-3096-4d22-91cd-6838ab25caca-kube-api-access-cttjd\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.644482 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72f468fc0db5f8fa80c14856192de03aaa7435cdc3cc301ecf1779787742eb9" Mar 09 09:58:10 crc kubenswrapper[4971]: I0309 09:58:10.644582 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lbnc5" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.094394 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gxms"] Mar 09 09:58:11 crc kubenswrapper[4971]: E0309 09:58:11.094895 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f7222c-3096-4d22-91cd-6838ab25caca" containerName="swift-ring-rebalance" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.094927 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f7222c-3096-4d22-91cd-6838ab25caca" containerName="swift-ring-rebalance" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.095453 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f7222c-3096-4d22-91cd-6838ab25caca" containerName="swift-ring-rebalance" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.096480 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.099085 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.099706 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.111415 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gxms"] Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.163083 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f7222c-3096-4d22-91cd-6838ab25caca" path="/var/lib/kubelet/pods/29f7222c-3096-4d22-91cd-6838ab25caca/volumes" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.207131 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.207222 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lfd9\" (UniqueName: \"kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.207244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.207260 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.207273 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: 
I0309 09:58:11.207293 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309040 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309471 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309628 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lfd9\" (UniqueName: \"kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309659 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc 
kubenswrapper[4971]: I0309 09:58:11.309682 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309698 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.309733 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.310311 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.311425 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc 
kubenswrapper[4971]: I0309 09:58:11.315434 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.320486 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.326067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lfd9\" (UniqueName: \"kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9\") pod \"swift-ring-rebalance-debug-9gxms\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.425058 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:11 crc kubenswrapper[4971]: I0309 09:58:11.866861 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gxms"] Mar 09 09:58:11 crc kubenswrapper[4971]: W0309 09:58:11.867469 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dee6466_5b7c_4a9d_a913_6a6e5474c0be.slice/crio-886f40e54a16f86e34a8c3120d6ea8357b73fea59ba47119ccfcbc70adb1d6d1 WatchSource:0}: Error finding container 886f40e54a16f86e34a8c3120d6ea8357b73fea59ba47119ccfcbc70adb1d6d1: Status 404 returned error can't find the container with id 886f40e54a16f86e34a8c3120d6ea8357b73fea59ba47119ccfcbc70adb1d6d1 Mar 09 09:58:12 crc kubenswrapper[4971]: I0309 09:58:12.671110 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" event={"ID":"0dee6466-5b7c-4a9d-a913-6a6e5474c0be","Type":"ContainerStarted","Data":"762f8843f221039bdbbc303c6e2cf8c8ece2c4760c9db1e0fad0724d61b9447d"} Mar 09 09:58:12 crc kubenswrapper[4971]: I0309 09:58:12.671494 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" event={"ID":"0dee6466-5b7c-4a9d-a913-6a6e5474c0be","Type":"ContainerStarted","Data":"886f40e54a16f86e34a8c3120d6ea8357b73fea59ba47119ccfcbc70adb1d6d1"} Mar 09 09:58:13 crc kubenswrapper[4971]: I0309 09:58:13.683313 4971 generic.go:334] "Generic (PLEG): container finished" podID="0dee6466-5b7c-4a9d-a913-6a6e5474c0be" containerID="762f8843f221039bdbbc303c6e2cf8c8ece2c4760c9db1e0fad0724d61b9447d" exitCode=0 Mar 09 09:58:13 crc kubenswrapper[4971]: I0309 09:58:13.683456 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" 
event={"ID":"0dee6466-5b7c-4a9d-a913-6a6e5474c0be","Type":"ContainerDied","Data":"762f8843f221039bdbbc303c6e2cf8c8ece2c4760c9db1e0fad0724d61b9447d"} Mar 09 09:58:14 crc kubenswrapper[4971]: I0309 09:58:14.796515 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:58:14 crc kubenswrapper[4971]: I0309 09:58:14.796633 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:58:14 crc kubenswrapper[4971]: I0309 09:58:14.796717 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 09:58:14 crc kubenswrapper[4971]: I0309 09:58:14.797795 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:58:14 crc kubenswrapper[4971]: I0309 09:58:14.797923 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9" gracePeriod=600 Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.035868 4971 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.092483 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gxms"] Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.110706 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gxms"] Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186263 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lfd9\" (UniqueName: \"kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186452 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186483 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186584 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.186633 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf\") pod \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\" (UID: \"0dee6466-5b7c-4a9d-a913-6a6e5474c0be\") " Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.187775 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.187982 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.192693 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9" (OuterVolumeSpecName: "kube-api-access-5lfd9") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "kube-api-access-5lfd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.218790 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.223088 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts" (OuterVolumeSpecName: "scripts") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.238583 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0dee6466-5b7c-4a9d-a913-6a6e5474c0be" (UID: "0dee6466-5b7c-4a9d-a913-6a6e5474c0be"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288182 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lfd9\" (UniqueName: \"kubernetes.io/projected/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-kube-api-access-5lfd9\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288209 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288218 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288228 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288236 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.288243 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0dee6466-5b7c-4a9d-a913-6a6e5474c0be-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.707176 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gxms" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.707288 4971 scope.go:117] "RemoveContainer" containerID="762f8843f221039bdbbc303c6e2cf8c8ece2c4760c9db1e0fad0724d61b9447d" Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.711529 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9" exitCode=0 Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.711596 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9"} Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.711652 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819"} Mar 09 09:58:15 crc kubenswrapper[4971]: I0309 09:58:15.741057 4971 scope.go:117] "RemoveContainer" containerID="b6651c67ba0d34ad8905aa76d3d3c83b2bac897a26e5ef479f58dacc0a091808" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.250862 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f2824"] Mar 09 09:58:16 crc kubenswrapper[4971]: E0309 09:58:16.251326 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dee6466-5b7c-4a9d-a913-6a6e5474c0be" containerName="swift-ring-rebalance" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.251371 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dee6466-5b7c-4a9d-a913-6a6e5474c0be" containerName="swift-ring-rebalance" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 
09:58:16.251602 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dee6466-5b7c-4a9d-a913-6a6e5474c0be" containerName="swift-ring-rebalance" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.252314 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.253848 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.255046 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.266641 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f2824"] Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.404777 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.404829 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.404855 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.404884 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.404913 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr66k\" (UniqueName: \"kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.405138 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.506613 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.506746 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.506831 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.506920 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.507019 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr66k\" (UniqueName: \"kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.507131 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.507177 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.507816 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.507862 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.512234 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.512337 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.523809 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr66k\" (UniqueName: 
\"kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k\") pod \"swift-ring-rebalance-debug-f2824\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.576033 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:16 crc kubenswrapper[4971]: I0309 09:58:16.972654 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f2824"] Mar 09 09:58:17 crc kubenswrapper[4971]: I0309 09:58:17.160784 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dee6466-5b7c-4a9d-a913-6a6e5474c0be" path="/var/lib/kubelet/pods/0dee6466-5b7c-4a9d-a913-6a6e5474c0be/volumes" Mar 09 09:58:17 crc kubenswrapper[4971]: I0309 09:58:17.735554 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" event={"ID":"0c222a90-722b-48f3-a908-3b7cb713c06b","Type":"ContainerStarted","Data":"466c30e2b53e4edf62fe8fec55369d93336b1fa3093a5516d92bd5b0518f9b0e"} Mar 09 09:58:17 crc kubenswrapper[4971]: I0309 09:58:17.735606 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" event={"ID":"0c222a90-722b-48f3-a908-3b7cb713c06b","Type":"ContainerStarted","Data":"1d31dddbd6d479f6f1e74c6afede3643cfb1ee6fd4b9f9931750cd3b29d85374"} Mar 09 09:58:17 crc kubenswrapper[4971]: I0309 09:58:17.757030 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" podStartSLOduration=1.7570084480000001 podStartE2EDuration="1.757008448s" podCreationTimestamp="2026-03-09 09:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:17.752160384 +0000 UTC 
m=+2301.312088214" watchObservedRunningTime="2026-03-09 09:58:17.757008448 +0000 UTC m=+2301.316936268" Mar 09 09:58:18 crc kubenswrapper[4971]: I0309 09:58:18.749469 4971 generic.go:334] "Generic (PLEG): container finished" podID="0c222a90-722b-48f3-a908-3b7cb713c06b" containerID="466c30e2b53e4edf62fe8fec55369d93336b1fa3093a5516d92bd5b0518f9b0e" exitCode=0 Mar 09 09:58:18 crc kubenswrapper[4971]: I0309 09:58:18.749971 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" event={"ID":"0c222a90-722b-48f3-a908-3b7cb713c06b","Type":"ContainerDied","Data":"466c30e2b53e4edf62fe8fec55369d93336b1fa3093a5516d92bd5b0518f9b0e"} Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.011184 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.038057 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f2824"] Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.042730 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f2824"] Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.159488 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.159573 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr66k\" (UniqueName: \"kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: 
I0309 09:58:20.159615 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.159715 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.159758 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.159881 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts\") pod \"0c222a90-722b-48f3-a908-3b7cb713c06b\" (UID: \"0c222a90-722b-48f3-a908-3b7cb713c06b\") " Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.160759 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.160816 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.167829 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k" (OuterVolumeSpecName: "kube-api-access-kr66k") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "kube-api-access-kr66k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.182427 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.189766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts" (OuterVolumeSpecName: "scripts") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.197538 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0c222a90-722b-48f3-a908-3b7cb713c06b" (UID: "0c222a90-722b-48f3-a908-3b7cb713c06b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261311 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261427 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261446 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr66k\" (UniqueName: \"kubernetes.io/projected/0c222a90-722b-48f3-a908-3b7cb713c06b-kube-api-access-kr66k\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261459 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c222a90-722b-48f3-a908-3b7cb713c06b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261469 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c222a90-722b-48f3-a908-3b7cb713c06b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.261479 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0c222a90-722b-48f3-a908-3b7cb713c06b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.768139 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d31dddbd6d479f6f1e74c6afede3643cfb1ee6fd4b9f9931750cd3b29d85374" Mar 09 09:58:20 crc kubenswrapper[4971]: I0309 09:58:20.768190 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f2824" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.165161 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c222a90-722b-48f3-a908-3b7cb713c06b" path="/var/lib/kubelet/pods/0c222a90-722b-48f3-a908-3b7cb713c06b/volumes" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.179785 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s"] Mar 09 09:58:21 crc kubenswrapper[4971]: E0309 09:58:21.180125 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c222a90-722b-48f3-a908-3b7cb713c06b" containerName="swift-ring-rebalance" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.180145 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c222a90-722b-48f3-a908-3b7cb713c06b" containerName="swift-ring-rebalance" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.180345 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c222a90-722b-48f3-a908-3b7cb713c06b" containerName="swift-ring-rebalance" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.181146 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.187520 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.187731 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.196338 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s"] Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.274881 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.274951 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.274995 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8w9\" (UniqueName: \"kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.275023 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.275116 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.275146 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376062 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376131 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8w9\" (UniqueName: \"kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: 
I0309 09:58:21.376191 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376214 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376280 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376302 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.376693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 
09:58:21.376840 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.377098 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.383925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.385410 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.393671 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8w9\" (UniqueName: \"kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9\") pod \"swift-ring-rebalance-debug-hgk6s\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.509992 4971 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:21 crc kubenswrapper[4971]: I0309 09:58:21.917870 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s"] Mar 09 09:58:21 crc kubenswrapper[4971]: W0309 09:58:21.923635 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bb1873b_0d46_4a33_8543_824bb1767375.slice/crio-fff7761455a5121e17623e8fb921505af8c6a7b76e6738c6b27984dc12002878 WatchSource:0}: Error finding container fff7761455a5121e17623e8fb921505af8c6a7b76e6738c6b27984dc12002878: Status 404 returned error can't find the container with id fff7761455a5121e17623e8fb921505af8c6a7b76e6738c6b27984dc12002878 Mar 09 09:58:22 crc kubenswrapper[4971]: I0309 09:58:22.793623 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" event={"ID":"6bb1873b-0d46-4a33-8543-824bb1767375","Type":"ContainerStarted","Data":"d449ef5092b18da58945aee2c13aebeec4073b79a8035c1f6b92349d56fc800b"} Mar 09 09:58:22 crc kubenswrapper[4971]: I0309 09:58:22.794063 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" event={"ID":"6bb1873b-0d46-4a33-8543-824bb1767375","Type":"ContainerStarted","Data":"fff7761455a5121e17623e8fb921505af8c6a7b76e6738c6b27984dc12002878"} Mar 09 09:58:22 crc kubenswrapper[4971]: I0309 09:58:22.816410 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" podStartSLOduration=1.816387585 podStartE2EDuration="1.816387585s" podCreationTimestamp="2026-03-09 09:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:22.810404951 +0000 UTC m=+2306.370332781" 
watchObservedRunningTime="2026-03-09 09:58:22.816387585 +0000 UTC m=+2306.376315405" Mar 09 09:58:23 crc kubenswrapper[4971]: I0309 09:58:23.805763 4971 generic.go:334] "Generic (PLEG): container finished" podID="6bb1873b-0d46-4a33-8543-824bb1767375" containerID="d449ef5092b18da58945aee2c13aebeec4073b79a8035c1f6b92349d56fc800b" exitCode=0 Mar 09 09:58:23 crc kubenswrapper[4971]: I0309 09:58:23.805830 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" event={"ID":"6bb1873b-0d46-4a33-8543-824bb1767375","Type":"ContainerDied","Data":"d449ef5092b18da58945aee2c13aebeec4073b79a8035c1f6b92349d56fc800b"} Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.071282 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.121191 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s"] Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.127455 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s"] Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.238821 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.238925 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.239025 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.239080 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.239152 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8w9\" (UniqueName: \"kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.239226 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf\") pod \"6bb1873b-0d46-4a33-8543-824bb1767375\" (UID: \"6bb1873b-0d46-4a33-8543-824bb1767375\") " Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.242434 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.242466 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.256715 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9" (OuterVolumeSpecName: "kube-api-access-qp8w9") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "kube-api-access-qp8w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.259701 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts" (OuterVolumeSpecName: "scripts") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.265888 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.272296 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6bb1873b-0d46-4a33-8543-824bb1767375" (UID: "6bb1873b-0d46-4a33-8543-824bb1767375"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341440 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341854 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6bb1873b-0d46-4a33-8543-824bb1767375-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341881 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341900 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6bb1873b-0d46-4a33-8543-824bb1767375-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341918 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp8w9\" (UniqueName: \"kubernetes.io/projected/6bb1873b-0d46-4a33-8543-824bb1767375-kube-api-access-qp8w9\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.341976 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/6bb1873b-0d46-4a33-8543-824bb1767375-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.827771 4971 scope.go:117] "RemoveContainer" containerID="d449ef5092b18da58945aee2c13aebeec4073b79a8035c1f6b92349d56fc800b" Mar 09 09:58:25 crc kubenswrapper[4971]: I0309 09:58:25.827848 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hgk6s" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.240685 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn"] Mar 09 09:58:26 crc kubenswrapper[4971]: E0309 09:58:26.241001 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb1873b-0d46-4a33-8543-824bb1767375" containerName="swift-ring-rebalance" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.241013 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb1873b-0d46-4a33-8543-824bb1767375" containerName="swift-ring-rebalance" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.241170 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb1873b-0d46-4a33-8543-824bb1767375" containerName="swift-ring-rebalance" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.241706 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.244816 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.248890 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.267127 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn"] Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356278 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356337 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxj2\" (UniqueName: \"kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356408 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356450 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356478 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.356531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.457339 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.457434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxj2\" (UniqueName: \"kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 
crc kubenswrapper[4971]: I0309 09:58:26.457485 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.457531 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.457564 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.457585 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.458617 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc 
kubenswrapper[4971]: I0309 09:58:26.458789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.458937 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.461140 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.462078 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.473832 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxj2\" (UniqueName: \"kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2\") pod \"swift-ring-rebalance-debug-rtvsn\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: 
I0309 09:58:26.560813 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:26 crc kubenswrapper[4971]: I0309 09:58:26.974011 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn"] Mar 09 09:58:27 crc kubenswrapper[4971]: I0309 09:58:27.170318 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb1873b-0d46-4a33-8543-824bb1767375" path="/var/lib/kubelet/pods/6bb1873b-0d46-4a33-8543-824bb1767375/volumes" Mar 09 09:58:27 crc kubenswrapper[4971]: I0309 09:58:27.849302 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" event={"ID":"13a1c171-d154-40bd-9c46-ad973c6b73f6","Type":"ContainerStarted","Data":"f6b0f8bb2f6fdd17ab22e25265268bc903a3a8dd8845a64ae9fdfc7a857368e4"} Mar 09 09:58:27 crc kubenswrapper[4971]: I0309 09:58:27.849904 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" event={"ID":"13a1c171-d154-40bd-9c46-ad973c6b73f6","Type":"ContainerStarted","Data":"71a0c6cab969645e7209ec9a3544953f20c7b42cad180125a0c95e6038ede4ab"} Mar 09 09:58:27 crc kubenswrapper[4971]: I0309 09:58:27.878122 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" podStartSLOduration=1.8781000570000002 podStartE2EDuration="1.878100057s" podCreationTimestamp="2026-03-09 09:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:27.870896749 +0000 UTC m=+2311.430824589" watchObservedRunningTime="2026-03-09 09:58:27.878100057 +0000 UTC m=+2311.438027877" Mar 09 09:58:28 crc kubenswrapper[4971]: I0309 09:58:28.859852 4971 generic.go:334] "Generic (PLEG): container finished" podID="13a1c171-d154-40bd-9c46-ad973c6b73f6" 
containerID="f6b0f8bb2f6fdd17ab22e25265268bc903a3a8dd8845a64ae9fdfc7a857368e4" exitCode=0 Mar 09 09:58:28 crc kubenswrapper[4971]: I0309 09:58:28.859943 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" event={"ID":"13a1c171-d154-40bd-9c46-ad973c6b73f6","Type":"ContainerDied","Data":"f6b0f8bb2f6fdd17ab22e25265268bc903a3a8dd8845a64ae9fdfc7a857368e4"} Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.163640 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.206265 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn"] Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.220197 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn"] Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313405 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfxj2\" (UniqueName: \"kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313446 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313620 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: 
\"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313662 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313694 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.313719 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf\") pod \"13a1c171-d154-40bd-9c46-ad973c6b73f6\" (UID: \"13a1c171-d154-40bd-9c46-ad973c6b73f6\") " Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.314271 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.314462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.318569 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2" (OuterVolumeSpecName: "kube-api-access-qfxj2") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "kube-api-access-qfxj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.332418 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts" (OuterVolumeSpecName: "scripts") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.335299 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.340274 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "13a1c171-d154-40bd-9c46-ad973c6b73f6" (UID: "13a1c171-d154-40bd-9c46-ad973c6b73f6"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416061 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfxj2\" (UniqueName: \"kubernetes.io/projected/13a1c171-d154-40bd-9c46-ad973c6b73f6-kube-api-access-qfxj2\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416102 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416114 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416123 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/13a1c171-d154-40bd-9c46-ad973c6b73f6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416132 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/13a1c171-d154-40bd-9c46-ad973c6b73f6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.416140 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/13a1c171-d154-40bd-9c46-ad973c6b73f6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.878341 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a0c6cab969645e7209ec9a3544953f20c7b42cad180125a0c95e6038ede4ab" Mar 09 09:58:30 crc kubenswrapper[4971]: I0309 09:58:30.878429 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rtvsn" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.160013 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a1c171-d154-40bd-9c46-ad973c6b73f6" path="/var/lib/kubelet/pods/13a1c171-d154-40bd-9c46-ad973c6b73f6/volumes" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.348506 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"] Mar 09 09:58:31 crc kubenswrapper[4971]: E0309 09:58:31.348805 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a1c171-d154-40bd-9c46-ad973c6b73f6" containerName="swift-ring-rebalance" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.348819 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a1c171-d154-40bd-9c46-ad973c6b73f6" containerName="swift-ring-rebalance" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.349000 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a1c171-d154-40bd-9c46-ad973c6b73f6" containerName="swift-ring-rebalance" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.349629 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.354798 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.354978 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.361691 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"] Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430618 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430728 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430763 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2c9fv\" (UniqueName: \"kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430837 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.430894 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.532702 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.532762 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 
09:58:31.532839 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.532875 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.532901 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9fv\" (UniqueName: \"kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.532935 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.534150 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.534504 
4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.534690 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.536994 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.538114 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.555272 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9fv\" (UniqueName: \"kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv\") pod \"swift-ring-rebalance-debug-v8rnm\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:31 crc kubenswrapper[4971]: I0309 09:58:31.674415 4971 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:32 crc kubenswrapper[4971]: I0309 09:58:32.104377 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"] Mar 09 09:58:32 crc kubenswrapper[4971]: I0309 09:58:32.899822 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" event={"ID":"b13fcff2-a0f1-4641-8b66-ba430cd4b956","Type":"ContainerStarted","Data":"a95776a42bffa19910c66efeaed1e0d4269ff8edd7efefa0d1e23aa23d83d282"} Mar 09 09:58:32 crc kubenswrapper[4971]: I0309 09:58:32.900172 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" event={"ID":"b13fcff2-a0f1-4641-8b66-ba430cd4b956","Type":"ContainerStarted","Data":"03a97c01ec946576ecb54281a57013796731c81b4bce1075b0b15908ea2a5676"} Mar 09 09:58:32 crc kubenswrapper[4971]: I0309 09:58:32.933304 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" podStartSLOduration=1.93328723 podStartE2EDuration="1.93328723s" podCreationTimestamp="2026-03-09 09:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:32.922575705 +0000 UTC m=+2316.482503515" watchObservedRunningTime="2026-03-09 09:58:32.93328723 +0000 UTC m=+2316.493215040" Mar 09 09:58:33 crc kubenswrapper[4971]: I0309 09:58:33.909822 4971 generic.go:334] "Generic (PLEG): container finished" podID="b13fcff2-a0f1-4641-8b66-ba430cd4b956" containerID="a95776a42bffa19910c66efeaed1e0d4269ff8edd7efefa0d1e23aa23d83d282" exitCode=0 Mar 09 09:58:33 crc kubenswrapper[4971]: I0309 09:58:33.909884 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" 
event={"ID":"b13fcff2-a0f1-4641-8b66-ba430cd4b956","Type":"ContainerDied","Data":"a95776a42bffa19910c66efeaed1e0d4269ff8edd7efefa0d1e23aa23d83d282"} Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.272365 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm" Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.303745 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"] Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.311676 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"] Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409599 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9fv\" (UniqueName: \"kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409666 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409741 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409765 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.409892 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf\") pod \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\" (UID: \"b13fcff2-a0f1-4641-8b66-ba430cd4b956\") " Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.410470 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.410593 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.421607 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv" (OuterVolumeSpecName: "kube-api-access-2c9fv") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "kube-api-access-2c9fv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.434511 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts" (OuterVolumeSpecName: "scripts") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.435789 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.436031 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b13fcff2-a0f1-4641-8b66-ba430cd4b956" (UID: "b13fcff2-a0f1-4641-8b66-ba430cd4b956"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511647 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511682 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511692 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9fv\" (UniqueName: \"kubernetes.io/projected/b13fcff2-a0f1-4641-8b66-ba430cd4b956-kube-api-access-2c9fv\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511704 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b13fcff2-a0f1-4641-8b66-ba430cd4b956-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511712 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b13fcff2-a0f1-4641-8b66-ba430cd4b956-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.511720 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b13fcff2-a0f1-4641-8b66-ba430cd4b956-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.926264 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a97c01ec946576ecb54281a57013796731c81b4bce1075b0b15908ea2a5676"
Mar 09 09:58:35 crc kubenswrapper[4971]: I0309 09:58:35.926324 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8rnm"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.444313 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"]
Mar 09 09:58:36 crc kubenswrapper[4971]: E0309 09:58:36.444640 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13fcff2-a0f1-4641-8b66-ba430cd4b956" containerName="swift-ring-rebalance"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.444653 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13fcff2-a0f1-4641-8b66-ba430cd4b956" containerName="swift-ring-rebalance"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.444828 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13fcff2-a0f1-4641-8b66-ba430cd4b956" containerName="swift-ring-rebalance"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.445328 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.447207 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.447313 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.455605 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"]
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.626677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.627097 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.627209 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.627308 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.627414 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.627511 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5gxx\" (UniqueName: \"kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.730078 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.730633 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.730741 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.730899 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5gxx\" (UniqueName: \"kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.731081 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.731237 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.733011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.734383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.741018 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.741942 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.746141 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:36 crc kubenswrapper[4971]: I0309 09:58:36.760957 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5gxx\" (UniqueName: \"kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx\") pod \"swift-ring-rebalance-debug-xrxt9\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:37 crc kubenswrapper[4971]: I0309 09:58:37.574049 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:37 crc kubenswrapper[4971]: I0309 09:58:37.583625 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13fcff2-a0f1-4641-8b66-ba430cd4b956" path="/var/lib/kubelet/pods/b13fcff2-a0f1-4641-8b66-ba430cd4b956/volumes"
Mar 09 09:58:38 crc kubenswrapper[4971]: I0309 09:58:38.023456 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"]
Mar 09 09:58:38 crc kubenswrapper[4971]: W0309 09:58:38.025207 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc647b25_56e8_48e6_9159_aef1dde5bf61.slice/crio-c16570e2fd3726b3688bb2ca3af16aed9e4c4623a0ee945a73bac46d2c0ef2f1 WatchSource:0}: Error finding container c16570e2fd3726b3688bb2ca3af16aed9e4c4623a0ee945a73bac46d2c0ef2f1: Status 404 returned error can't find the container with id c16570e2fd3726b3688bb2ca3af16aed9e4c4623a0ee945a73bac46d2c0ef2f1
Mar 09 09:58:38 crc kubenswrapper[4971]: I0309 09:58:38.594392 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9" event={"ID":"dc647b25-56e8-48e6-9159-aef1dde5bf61","Type":"ContainerStarted","Data":"0cd6cc2a47cb8373e2d5f9a9c76982bd9c67d753e300eee5ae292b2b3363fe21"}
Mar 09 09:58:38 crc kubenswrapper[4971]: I0309 09:58:38.594755 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9" event={"ID":"dc647b25-56e8-48e6-9159-aef1dde5bf61","Type":"ContainerStarted","Data":"c16570e2fd3726b3688bb2ca3af16aed9e4c4623a0ee945a73bac46d2c0ef2f1"}
Mar 09 09:58:38 crc kubenswrapper[4971]: I0309 09:58:38.612931 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9" podStartSLOduration=2.612909954 podStartE2EDuration="2.612909954s" podCreationTimestamp="2026-03-09 09:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:38.608289797 +0000 UTC m=+2322.168217597" watchObservedRunningTime="2026-03-09 09:58:38.612909954 +0000 UTC m=+2322.172837764"
Mar 09 09:58:39 crc kubenswrapper[4971]: I0309 09:58:39.604020 4971 generic.go:334] "Generic (PLEG): container finished" podID="dc647b25-56e8-48e6-9159-aef1dde5bf61" containerID="0cd6cc2a47cb8373e2d5f9a9c76982bd9c67d753e300eee5ae292b2b3363fe21" exitCode=0
Mar 09 09:58:39 crc kubenswrapper[4971]: I0309 09:58:39.604072 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9" event={"ID":"dc647b25-56e8-48e6-9159-aef1dde5bf61","Type":"ContainerDied","Data":"0cd6cc2a47cb8373e2d5f9a9c76982bd9c67d753e300eee5ae292b2b3363fe21"}
Mar 09 09:58:40 crc kubenswrapper[4971]: I0309 09:58:40.936844 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:40 crc kubenswrapper[4971]: I0309 09:58:40.976314 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"]
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.002263 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"]
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.096944 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.097024 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.097166 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5gxx\" (UniqueName: \"kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.097220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.097241 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.097280 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts\") pod \"dc647b25-56e8-48e6-9159-aef1dde5bf61\" (UID: \"dc647b25-56e8-48e6-9159-aef1dde5bf61\") "
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.098113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.098157 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.108817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx" (OuterVolumeSpecName: "kube-api-access-t5gxx") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "kube-api-access-t5gxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.124143 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts" (OuterVolumeSpecName: "scripts") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.124634 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.128040 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dc647b25-56e8-48e6-9159-aef1dde5bf61" (UID: "dc647b25-56e8-48e6-9159-aef1dde5bf61"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.166628 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc647b25-56e8-48e6-9159-aef1dde5bf61" path="/var/lib/kubelet/pods/dc647b25-56e8-48e6-9159-aef1dde5bf61/volumes"
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199421 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dc647b25-56e8-48e6-9159-aef1dde5bf61-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199473 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5gxx\" (UniqueName: \"kubernetes.io/projected/dc647b25-56e8-48e6-9159-aef1dde5bf61-kube-api-access-t5gxx\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199488 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199501 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199512 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc647b25-56e8-48e6-9159-aef1dde5bf61-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.199522 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dc647b25-56e8-48e6-9159-aef1dde5bf61-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.621083 4971 scope.go:117] "RemoveContainer" containerID="0cd6cc2a47cb8373e2d5f9a9c76982bd9c67d753e300eee5ae292b2b3363fe21"
Mar 09 09:58:41 crc kubenswrapper[4971]: I0309 09:58:41.621094 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xrxt9"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.127449 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"]
Mar 09 09:58:42 crc kubenswrapper[4971]: E0309 09:58:42.128252 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc647b25-56e8-48e6-9159-aef1dde5bf61" containerName="swift-ring-rebalance"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.128277 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc647b25-56e8-48e6-9159-aef1dde5bf61" containerName="swift-ring-rebalance"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.128579 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc647b25-56e8-48e6-9159-aef1dde5bf61" containerName="swift-ring-rebalance"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.129545 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.131778 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.132064 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.142019 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"]
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.316878 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.317172 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.317268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflhk\" (UniqueName: \"kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.317493 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.317584 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.317612 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419166 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419232 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflhk\" (UniqueName: \"kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419274 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419320 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419361 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.419430 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.420011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.420191 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.420258 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.423849 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.423851 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.434761 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflhk\" (UniqueName: \"kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk\") pod \"swift-ring-rebalance-debug-qhjpl\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.444777 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:42 crc kubenswrapper[4971]: I0309 09:58:42.910574 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"]
Mar 09 09:58:43 crc kubenswrapper[4971]: I0309 09:58:43.651422 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl" event={"ID":"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec","Type":"ContainerStarted","Data":"04e71a63bf8cbd6485681602893ee5568ca8fadae56b469a333d393d95daef9f"}
Mar 09 09:58:43 crc kubenswrapper[4971]: I0309 09:58:43.651763 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl" event={"ID":"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec","Type":"ContainerStarted","Data":"67a72a57d6a469681d066dab96d855b4f61eea718a1f88320ed8ba288fb068f9"}
Mar 09 09:58:43 crc kubenswrapper[4971]: I0309 09:58:43.677641 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl" podStartSLOduration=1.677625698 podStartE2EDuration="1.677625698s" podCreationTimestamp="2026-03-09 09:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:43.670020599 +0000 UTC m=+2327.229948409" watchObservedRunningTime="2026-03-09 09:58:43.677625698 +0000 UTC m=+2327.237553508"
Mar 09 09:58:45 crc kubenswrapper[4971]: I0309 09:58:45.669941 4971 generic.go:334] "Generic (PLEG): container finished" podID="35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" containerID="04e71a63bf8cbd6485681602893ee5568ca8fadae56b469a333d393d95daef9f" exitCode=0
Mar 09 09:58:45 crc kubenswrapper[4971]: I0309 09:58:45.670044 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl" event={"ID":"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec","Type":"ContainerDied","Data":"04e71a63bf8cbd6485681602893ee5568ca8fadae56b469a333d393d95daef9f"}
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.034245 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.062123 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"]
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.068434 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl"]
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.219967 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.220035 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.220059 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.220161 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mflhk\" (UniqueName: \"kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.220190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.220232 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf\") pod \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\" (UID: \"35c8a04c-5da1-4c26-8cf7-b86cd0e85dec\") "
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.221318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.221603 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.226461 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk" (OuterVolumeSpecName: "kube-api-access-mflhk") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "kube-api-access-mflhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.254404 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts" (OuterVolumeSpecName: "scripts") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.254765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.255038 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" (UID: "35c8a04c-5da1-4c26-8cf7-b86cd0e85dec"). InnerVolumeSpecName "dispersionconf".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322555 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mflhk\" (UniqueName: \"kubernetes.io/projected/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-kube-api-access-mflhk\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322593 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322608 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322620 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322631 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.322666 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.693775 4971 scope.go:117] "RemoveContainer" containerID="04e71a63bf8cbd6485681602893ee5568ca8fadae56b469a333d393d95daef9f" Mar 09 09:58:47 crc kubenswrapper[4971]: I0309 09:58:47.693802 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qhjpl" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.205905 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8429"] Mar 09 09:58:48 crc kubenswrapper[4971]: E0309 09:58:48.206317 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" containerName="swift-ring-rebalance" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.206337 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" containerName="swift-ring-rebalance" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.206667 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" containerName="swift-ring-rebalance" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.207887 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.214777 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.214901 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.232423 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8429"] Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241278 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241394 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241516 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rwz\" (UniqueName: \"kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241551 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241587 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.241619 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343236 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343583 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rwz\" (UniqueName: \"kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343611 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343632 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343654 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.343689 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.344291 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.344335 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.344799 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.349239 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.349570 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.367413 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rwz\" (UniqueName: \"kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz\") pod \"swift-ring-rebalance-debug-v8429\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.532921 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:48 crc kubenswrapper[4971]: I0309 09:58:48.975846 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8429"] Mar 09 09:58:48 crc kubenswrapper[4971]: W0309 09:58:48.977529 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb576367_5ecb_4b0f_9e72_eeecea558d93.slice/crio-7fcaafd6ce4ff63c7e060c8f806bde7142f0d301cd41f26127a642d2cde2baf0 WatchSource:0}: Error finding container 7fcaafd6ce4ff63c7e060c8f806bde7142f0d301cd41f26127a642d2cde2baf0: Status 404 returned error can't find the container with id 7fcaafd6ce4ff63c7e060c8f806bde7142f0d301cd41f26127a642d2cde2baf0 Mar 09 09:58:49 crc kubenswrapper[4971]: I0309 09:58:49.166610 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c8a04c-5da1-4c26-8cf7-b86cd0e85dec" path="/var/lib/kubelet/pods/35c8a04c-5da1-4c26-8cf7-b86cd0e85dec/volumes" Mar 09 09:58:49 crc kubenswrapper[4971]: I0309 09:58:49.713475 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" event={"ID":"fb576367-5ecb-4b0f-9e72-eeecea558d93","Type":"ContainerStarted","Data":"3c333d82222ddd5a72bedfefd528ead9d93b394c3110378c6c63bf67ceeb4f09"} Mar 09 09:58:49 crc kubenswrapper[4971]: I0309 09:58:49.713811 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" event={"ID":"fb576367-5ecb-4b0f-9e72-eeecea558d93","Type":"ContainerStarted","Data":"7fcaafd6ce4ff63c7e060c8f806bde7142f0d301cd41f26127a642d2cde2baf0"} Mar 09 09:58:50 crc kubenswrapper[4971]: I0309 09:58:50.730767 4971 generic.go:334] "Generic (PLEG): container finished" podID="fb576367-5ecb-4b0f-9e72-eeecea558d93" containerID="3c333d82222ddd5a72bedfefd528ead9d93b394c3110378c6c63bf67ceeb4f09" exitCode=0 Mar 09 09:58:50 crc kubenswrapper[4971]: 
I0309 09:58:50.730872 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" event={"ID":"fb576367-5ecb-4b0f-9e72-eeecea558d93","Type":"ContainerDied","Data":"3c333d82222ddd5a72bedfefd528ead9d93b394c3110378c6c63bf67ceeb4f09"} Mar 09 09:58:51 crc kubenswrapper[4971]: I0309 09:58:51.994154 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.035314 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8429"] Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.045793 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v8429"] Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102542 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102612 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102734 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102789 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102822 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95rwz\" (UniqueName: \"kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.102843 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices\") pod \"fb576367-5ecb-4b0f-9e72-eeecea558d93\" (UID: \"fb576367-5ecb-4b0f-9e72-eeecea558d93\") " Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.103677 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.103836 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.108931 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz" (OuterVolumeSpecName: "kube-api-access-95rwz") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "kube-api-access-95rwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.128112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts" (OuterVolumeSpecName: "scripts") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.129101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.131748 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fb576367-5ecb-4b0f-9e72-eeecea558d93" (UID: "fb576367-5ecb-4b0f-9e72-eeecea558d93"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204167 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb576367-5ecb-4b0f-9e72-eeecea558d93-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204219 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204230 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204241 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95rwz\" (UniqueName: \"kubernetes.io/projected/fb576367-5ecb-4b0f-9e72-eeecea558d93-kube-api-access-95rwz\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204255 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb576367-5ecb-4b0f-9e72-eeecea558d93-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.204266 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb576367-5ecb-4b0f-9e72-eeecea558d93-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.750864 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fcaafd6ce4ff63c7e060c8f806bde7142f0d301cd41f26127a642d2cde2baf0" Mar 09 09:58:52 crc kubenswrapper[4971]: I0309 09:58:52.750922 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v8429" Mar 09 09:58:52 crc kubenswrapper[4971]: E0309 09:58:52.876193 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb576367_5ecb_4b0f_9e72_eeecea558d93.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.164776 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb576367-5ecb-4b0f-9e72-eeecea558d93" path="/var/lib/kubelet/pods/fb576367-5ecb-4b0f-9e72-eeecea558d93/volumes" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.180314 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8"] Mar 09 09:58:53 crc kubenswrapper[4971]: E0309 09:58:53.180689 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb576367-5ecb-4b0f-9e72-eeecea558d93" containerName="swift-ring-rebalance" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.180713 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb576367-5ecb-4b0f-9e72-eeecea558d93" containerName="swift-ring-rebalance" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.180888 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb576367-5ecb-4b0f-9e72-eeecea558d93" containerName="swift-ring-rebalance" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.181461 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.183242 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.183669 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.193810 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8"] Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.319982 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.320095 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6wgf\" (UniqueName: \"kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.320177 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.320216 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.320268 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.320313 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.422623 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.423012 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 
09:58:53.423069 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6wgf\" (UniqueName: \"kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.423118 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.423141 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.423190 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.423869 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc 
kubenswrapper[4971]: I0309 09:58:53.424541 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.424801 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.426629 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.426697 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.439166 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6wgf\" (UniqueName: \"kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf\") pod \"swift-ring-rebalance-debug-c7kd8\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 
09:58:53.513749 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:53 crc kubenswrapper[4971]: I0309 09:58:53.914478 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8"] Mar 09 09:58:53 crc kubenswrapper[4971]: W0309 09:58:53.918940 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a20402_79c0_4c53_aae4_61a0d37b67b4.slice/crio-c8d9256acacdb6e06234d4a64097167111423d8c70d5f79e08773d25c832410f WatchSource:0}: Error finding container c8d9256acacdb6e06234d4a64097167111423d8c70d5f79e08773d25c832410f: Status 404 returned error can't find the container with id c8d9256acacdb6e06234d4a64097167111423d8c70d5f79e08773d25c832410f Mar 09 09:58:54 crc kubenswrapper[4971]: I0309 09:58:54.767848 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" event={"ID":"97a20402-79c0-4c53-aae4-61a0d37b67b4","Type":"ContainerStarted","Data":"ecb5af70365e99bb17d86c3bb6ae67474ac6fa2bf570bd9afe669807aad42c61"} Mar 09 09:58:54 crc kubenswrapper[4971]: I0309 09:58:54.767892 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" event={"ID":"97a20402-79c0-4c53-aae4-61a0d37b67b4","Type":"ContainerStarted","Data":"c8d9256acacdb6e06234d4a64097167111423d8c70d5f79e08773d25c832410f"} Mar 09 09:58:54 crc kubenswrapper[4971]: I0309 09:58:54.791925 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" podStartSLOduration=1.79190762 podStartE2EDuration="1.79190762s" podCreationTimestamp="2026-03-09 09:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:54.786244565 +0000 UTC 
m=+2338.346172375" watchObservedRunningTime="2026-03-09 09:58:54.79190762 +0000 UTC m=+2338.351835430" Mar 09 09:58:55 crc kubenswrapper[4971]: I0309 09:58:55.777741 4971 generic.go:334] "Generic (PLEG): container finished" podID="97a20402-79c0-4c53-aae4-61a0d37b67b4" containerID="ecb5af70365e99bb17d86c3bb6ae67474ac6fa2bf570bd9afe669807aad42c61" exitCode=0 Mar 09 09:58:55 crc kubenswrapper[4971]: I0309 09:58:55.777842 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" event={"ID":"97a20402-79c0-4c53-aae4-61a0d37b67b4","Type":"ContainerDied","Data":"ecb5af70365e99bb17d86c3bb6ae67474ac6fa2bf570bd9afe669807aad42c61"} Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.129093 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.168201 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8"] Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.168267 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8"] Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.294506 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.294618 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6wgf\" (UniqueName: \"kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 
09:58:57.294653 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.294674 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.294775 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.294793 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices\") pod \"97a20402-79c0-4c53-aae4-61a0d37b67b4\" (UID: \"97a20402-79c0-4c53-aae4-61a0d37b67b4\") " Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.296191 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.296566 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.306692 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf" (OuterVolumeSpecName: "kube-api-access-q6wgf") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "kube-api-access-q6wgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.321098 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.323442 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.329505 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts" (OuterVolumeSpecName: "scripts") pod "97a20402-79c0-4c53-aae4-61a0d37b67b4" (UID: "97a20402-79c0-4c53-aae4-61a0d37b67b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397034 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397532 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6wgf\" (UniqueName: \"kubernetes.io/projected/97a20402-79c0-4c53-aae4-61a0d37b67b4-kube-api-access-q6wgf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397568 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97a20402-79c0-4c53-aae4-61a0d37b67b4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397586 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97a20402-79c0-4c53-aae4-61a0d37b67b4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397605 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.397622 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/97a20402-79c0-4c53-aae4-61a0d37b67b4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.798798 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d9256acacdb6e06234d4a64097167111423d8c70d5f79e08773d25c832410f" Mar 09 09:58:57 crc kubenswrapper[4971]: I0309 09:58:57.798850 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c7kd8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.293106 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8"] Mar 09 09:58:58 crc kubenswrapper[4971]: E0309 09:58:58.293435 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a20402-79c0-4c53-aae4-61a0d37b67b4" containerName="swift-ring-rebalance" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.293455 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a20402-79c0-4c53-aae4-61a0d37b67b4" containerName="swift-ring-rebalance" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.293602 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a20402-79c0-4c53-aae4-61a0d37b67b4" containerName="swift-ring-rebalance" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.294034 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.296204 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.296320 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.307532 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8"] Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416207 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416297 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416332 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5swbt\" (UniqueName: \"kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416386 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416441 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.416481 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.517807 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.517852 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc 
kubenswrapper[4971]: I0309 09:58:58.517870 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.517922 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.517968 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.518003 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5swbt\" (UniqueName: \"kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.518825 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 
09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.518900 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.519286 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.523169 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.523528 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 09:58:58.537961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5swbt\" (UniqueName: \"kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt\") pod \"swift-ring-rebalance-debug-j2rc8\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:58 crc kubenswrapper[4971]: I0309 
09:58:58.626900 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:58:59 crc kubenswrapper[4971]: I0309 09:58:59.047832 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8"] Mar 09 09:58:59 crc kubenswrapper[4971]: I0309 09:58:59.166334 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a20402-79c0-4c53-aae4-61a0d37b67b4" path="/var/lib/kubelet/pods/97a20402-79c0-4c53-aae4-61a0d37b67b4/volumes" Mar 09 09:58:59 crc kubenswrapper[4971]: I0309 09:58:59.826965 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" event={"ID":"8d801ee0-aede-4275-8a32-e8b4cefdf954","Type":"ContainerStarted","Data":"4732f9b6cb4b774e4532abcdfa0dace1c4b84b196710fc30c5555b299ded7e47"} Mar 09 09:58:59 crc kubenswrapper[4971]: I0309 09:58:59.827393 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" event={"ID":"8d801ee0-aede-4275-8a32-e8b4cefdf954","Type":"ContainerStarted","Data":"2e915838718cd9a8fde3510ba704dcadff5ccfe3a2dbb6798ddeb1a649dccea6"} Mar 09 09:58:59 crc kubenswrapper[4971]: I0309 09:58:59.850798 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" podStartSLOduration=1.850774632 podStartE2EDuration="1.850774632s" podCreationTimestamp="2026-03-09 09:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:58:59.845508298 +0000 UTC m=+2343.405436118" watchObservedRunningTime="2026-03-09 09:58:59.850774632 +0000 UTC m=+2343.410702442" Mar 09 09:59:00 crc kubenswrapper[4971]: I0309 09:59:00.837252 4971 generic.go:334] "Generic (PLEG): container finished" podID="8d801ee0-aede-4275-8a32-e8b4cefdf954" 
containerID="4732f9b6cb4b774e4532abcdfa0dace1c4b84b196710fc30c5555b299ded7e47" exitCode=0 Mar 09 09:59:00 crc kubenswrapper[4971]: I0309 09:59:00.837291 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" event={"ID":"8d801ee0-aede-4275-8a32-e8b4cefdf954","Type":"ContainerDied","Data":"4732f9b6cb4b774e4532abcdfa0dace1c4b84b196710fc30c5555b299ded7e47"} Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.110066 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.150056 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8"] Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.155224 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8"] Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.172735 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.172868 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.172896 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 
09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.172929 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5swbt\" (UniqueName: \"kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.173010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.173033 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf\") pod \"8d801ee0-aede-4275-8a32-e8b4cefdf954\" (UID: \"8d801ee0-aede-4275-8a32-e8b4cefdf954\") " Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.174940 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.175133 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.179579 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt" (OuterVolumeSpecName: "kube-api-access-5swbt") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "kube-api-access-5swbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.193960 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts" (OuterVolumeSpecName: "scripts") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.194853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.198646 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8d801ee0-aede-4275-8a32-e8b4cefdf954" (UID: "8d801ee0-aede-4275-8a32-e8b4cefdf954"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273912 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8d801ee0-aede-4275-8a32-e8b4cefdf954-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273942 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273952 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5swbt\" (UniqueName: \"kubernetes.io/projected/8d801ee0-aede-4275-8a32-e8b4cefdf954-kube-api-access-5swbt\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273963 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8d801ee0-aede-4275-8a32-e8b4cefdf954-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273971 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.273980 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8d801ee0-aede-4275-8a32-e8b4cefdf954-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.856743 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e915838718cd9a8fde3510ba704dcadff5ccfe3a2dbb6798ddeb1a649dccea6" Mar 09 09:59:02 crc kubenswrapper[4971]: I0309 09:59:02.856796 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j2rc8" Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.160672 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d801ee0-aede-4275-8a32-e8b4cefdf954" path="/var/lib/kubelet/pods/8d801ee0-aede-4275-8a32-e8b4cefdf954/volumes" Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.288712 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"] Mar 09 09:59:03 crc kubenswrapper[4971]: E0309 09:59:03.289082 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d801ee0-aede-4275-8a32-e8b4cefdf954" containerName="swift-ring-rebalance" Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.289099 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d801ee0-aede-4275-8a32-e8b4cefdf954" containerName="swift-ring-rebalance" Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.289246 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d801ee0-aede-4275-8a32-e8b4cefdf954" containerName="swift-ring-rebalance" Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.289707 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.291305 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.291317 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.304655 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"]
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388170 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388244 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388287 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fstj4\" (UniqueName: \"kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388309 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388339 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.388443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fstj4\" (UniqueName: \"kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490402 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490427 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490483 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490539 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.490565 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.491197 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.491397 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.491410 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.495556 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.495728 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.507652 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fstj4\" (UniqueName: \"kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4\") pod \"swift-ring-rebalance-debug-2snpm\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.604404 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:03 crc kubenswrapper[4971]: I0309 09:59:03.995262 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"]
Mar 09 09:59:04 crc kubenswrapper[4971]: W0309 09:59:04.000037 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb7a0af_0c9f_4286_8f2d_d465058822f5.slice/crio-7c7d4854863c014f4b0990ced6c7d3d3e4e911e2dd54805711cd71b41fe66905 WatchSource:0}: Error finding container 7c7d4854863c014f4b0990ced6c7d3d3e4e911e2dd54805711cd71b41fe66905: Status 404 returned error can't find the container with id 7c7d4854863c014f4b0990ced6c7d3d3e4e911e2dd54805711cd71b41fe66905
Mar 09 09:59:04 crc kubenswrapper[4971]: I0309 09:59:04.877099 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm" event={"ID":"8fb7a0af-0c9f-4286-8f2d-d465058822f5","Type":"ContainerStarted","Data":"7ce52775298d36f70200a2f109c51e6cebafcadeef3055e9a76cf5b65b05d0ca"}
Mar 09 09:59:04 crc kubenswrapper[4971]: I0309 09:59:04.877490 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm" event={"ID":"8fb7a0af-0c9f-4286-8f2d-d465058822f5","Type":"ContainerStarted","Data":"7c7d4854863c014f4b0990ced6c7d3d3e4e911e2dd54805711cd71b41fe66905"}
Mar 09 09:59:04 crc kubenswrapper[4971]: I0309 09:59:04.895992 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm" podStartSLOduration=1.89596885 podStartE2EDuration="1.89596885s" podCreationTimestamp="2026-03-09 09:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:04.891475587 +0000 UTC m=+2348.451403397" watchObservedRunningTime="2026-03-09 09:59:04.89596885 +0000 UTC m=+2348.455896660"
Mar 09 09:59:05 crc kubenswrapper[4971]: I0309 09:59:05.886589 4971 generic.go:334] "Generic (PLEG): container finished" podID="8fb7a0af-0c9f-4286-8f2d-d465058822f5" containerID="7ce52775298d36f70200a2f109c51e6cebafcadeef3055e9a76cf5b65b05d0ca" exitCode=0
Mar 09 09:59:05 crc kubenswrapper[4971]: I0309 09:59:05.886635 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm" event={"ID":"8fb7a0af-0c9f-4286-8f2d-d465058822f5","Type":"ContainerDied","Data":"7ce52775298d36f70200a2f109c51e6cebafcadeef3055e9a76cf5b65b05d0ca"}
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.568642 4971 scope.go:117] "RemoveContainer" containerID="e93b3ecbf05d2a80a452f4e09c1b56b77749f18c5dbdbbeb500da4b4ce2182cf"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.604032 4971 scope.go:117] "RemoveContainer" containerID="e2465713fde0531a23362ad6760ad7da36b11e07dc49d14c5f7e7292edb7c0b4"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.635401 4971 scope.go:117] "RemoveContainer" containerID="9637f2c7d9b596a0022bdd874771092ff9c33d601f8048f17784582729bc9372"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.662070 4971 scope.go:117] "RemoveContainer" containerID="155016833dc428af8abfc333edc45781dccb518cf18df24f90eae524b6a98026"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.689992 4971 scope.go:117] "RemoveContainer" containerID="732971afa840b8df1a6736b8629246b240281a77aa37b9cb42e5b31288370944"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.725894 4971 scope.go:117] "RemoveContainer" containerID="5749369ae261d05f2985c4d1397094182354c50c40b2b8ee796b9f35da067601"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.750206 4971 scope.go:117] "RemoveContainer" containerID="a1e7cb89cb53089e1513d4e0dd5f9375d450bdb2770aca66d03ef0c8257ff6f3"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.775054 4971 scope.go:117] "RemoveContainer" containerID="c790fcd5ec69025e216f89e967a044d0b659fc7fc83557580a7466e7fb5576f5"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.801194 4971 scope.go:117] "RemoveContainer" containerID="e0be9f0aa4d9663449403044f1891acba9dccc3538adc7e5ef36feb822a5a6dd"
Mar 09 09:59:06 crc kubenswrapper[4971]: I0309 09:59:06.825945 4971 scope.go:117] "RemoveContainer" containerID="6bdf31424b9fac476fcdee729b745912a6c0e7331c5d9d088040ee0700caaf27"
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.134154 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.173750 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"]
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.179606 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"]
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242060 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242168 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fstj4\" (UniqueName: \"kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242193 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242225 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242364 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.242389 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices\") pod \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\" (UID: \"8fb7a0af-0c9f-4286-8f2d-d465058822f5\") "
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.243036 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.243141 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.247556 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4" (OuterVolumeSpecName: "kube-api-access-fstj4") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "kube-api-access-fstj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.261954 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts" (OuterVolumeSpecName: "scripts") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.262299 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.264010 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8fb7a0af-0c9f-4286-8f2d-d465058822f5" (UID: "8fb7a0af-0c9f-4286-8f2d-d465058822f5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344391 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fstj4\" (UniqueName: \"kubernetes.io/projected/8fb7a0af-0c9f-4286-8f2d-d465058822f5-kube-api-access-fstj4\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344540 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344569 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb7a0af-0c9f-4286-8f2d-d465058822f5-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344585 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb7a0af-0c9f-4286-8f2d-d465058822f5-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344596 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.344609 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7a0af-0c9f-4286-8f2d-d465058822f5-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.922976 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2snpm"
Mar 09 09:59:07 crc kubenswrapper[4971]: I0309 09:59:07.922999 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c7d4854863c014f4b0990ced6c7d3d3e4e911e2dd54805711cd71b41fe66905"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.298440 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-624gh"]
Mar 09 09:59:08 crc kubenswrapper[4971]: E0309 09:59:08.298748 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb7a0af-0c9f-4286-8f2d-d465058822f5" containerName="swift-ring-rebalance"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.298762 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb7a0af-0c9f-4286-8f2d-d465058822f5" containerName="swift-ring-rebalance"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.298894 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb7a0af-0c9f-4286-8f2d-d465058822f5" containerName="swift-ring-rebalance"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.299579 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.301446 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.305234 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.307377 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-624gh"]
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.462542 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.462667 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.462904 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.463082 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.463311 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qp9\" (UniqueName: \"kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.463495 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565343 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qp9\" (UniqueName: \"kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565520 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565568 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565609 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.565663 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.566026 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.566251 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.566990 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.570938 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.571048 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.590644 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qp9\" (UniqueName: \"kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9\") pod \"swift-ring-rebalance-debug-624gh\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.616589 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:08 crc kubenswrapper[4971]: I0309 09:59:08.999554 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-624gh"]
Mar 09 09:59:09 crc kubenswrapper[4971]: I0309 09:59:09.159755 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb7a0af-0c9f-4286-8f2d-d465058822f5" path="/var/lib/kubelet/pods/8fb7a0af-0c9f-4286-8f2d-d465058822f5/volumes"
Mar 09 09:59:09 crc kubenswrapper[4971]: I0309 09:59:09.946898 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh" event={"ID":"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7","Type":"ContainerStarted","Data":"5a62677f01f116d16f099dcfd7688e175521fbaa6f39a9541196f81f5f0dc3db"}
Mar 09 09:59:09 crc kubenswrapper[4971]: I0309 09:59:09.946946 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh" event={"ID":"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7","Type":"ContainerStarted","Data":"dffd8bb669d0a5425225bdb0dae92bc2888a99c575dd27469fc8cef28ec8f62a"}
Mar 09 09:59:09 crc kubenswrapper[4971]: I0309 09:59:09.969641 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh" podStartSLOduration=1.969610989 podStartE2EDuration="1.969610989s" podCreationTimestamp="2026-03-09 09:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:09.962379961 +0000 UTC m=+2353.522307771" watchObservedRunningTime="2026-03-09 09:59:09.969610989 +0000 UTC m=+2353.529538839"
Mar 09 09:59:10 crc kubenswrapper[4971]: I0309 09:59:10.957194 4971 generic.go:334] "Generic (PLEG): container finished" podID="eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" containerID="5a62677f01f116d16f099dcfd7688e175521fbaa6f39a9541196f81f5f0dc3db" exitCode=0
Mar 09 09:59:10 crc kubenswrapper[4971]: I0309 09:59:10.957264 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh" event={"ID":"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7","Type":"ContainerDied","Data":"5a62677f01f116d16f099dcfd7688e175521fbaa6f39a9541196f81f5f0dc3db"}
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.223762 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.254319 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-624gh"]
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.259988 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-624gh"]
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320728 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320786 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qp9\" (UniqueName: \"kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320837 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320914 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320958 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.320979 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts\") pod \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\" (UID: \"eee27fc4-e719-49dd-8c93-6fcfe6dda0d7\") "
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.321676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.321681 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.325969 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9" (OuterVolumeSpecName: "kube-api-access-v9qp9") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "kube-api-access-v9qp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.339265 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts" (OuterVolumeSpecName: "scripts") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.342209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.348097 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" (UID: "eee27fc4-e719-49dd-8c93-6fcfe6dda0d7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.423947 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.423977 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.423986 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.423996 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qp9\" (UniqueName: \"kubernetes.io/projected/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-kube-api-access-v9qp9\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.424008 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.424017 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.977018 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dffd8bb669d0a5425225bdb0dae92bc2888a99c575dd27469fc8cef28ec8f62a"
Mar 09 09:59:12 crc kubenswrapper[4971]: I0309 09:59:12.977105 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-624gh"
Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.163813 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" path="/var/lib/kubelet/pods/eee27fc4-e719-49dd-8c93-6fcfe6dda0d7/volumes"
Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.425871 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c775b"]
Mar 09 09:59:13 crc kubenswrapper[4971]: E0309 09:59:13.426629 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" containerName="swift-ring-rebalance"
Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.426648 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" containerName="swift-ring-rebalance"
Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.426846 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee27fc4-e719-49dd-8c93-6fcfe6dda0d7" containerName="swift-ring-rebalance"
Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.427504 4971 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.435283 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c775b"] Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.440821 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.440998 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.544785 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.545255 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n2l9\" (UniqueName: \"kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.545374 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.545617 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.545645 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.545675 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647219 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647285 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc 
kubenswrapper[4971]: I0309 09:59:13.647325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647403 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n2l9\" (UniqueName: \"kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647436 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.647759 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc 
kubenswrapper[4971]: I0309 09:59:13.648609 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.648642 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.651646 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.651910 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: I0309 09:59:13.664925 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n2l9\" (UniqueName: \"kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9\") pod \"swift-ring-rebalance-debug-c775b\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:13 crc kubenswrapper[4971]: 
I0309 09:59:13.753998 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:14 crc kubenswrapper[4971]: I0309 09:59:14.186773 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c775b"] Mar 09 09:59:14 crc kubenswrapper[4971]: I0309 09:59:14.996073 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" event={"ID":"8367a2d9-a755-4a83-96de-48072345b97c","Type":"ContainerStarted","Data":"50f20ce0bf95f462480f5417e1377c40128cdb475c8b05fba7fd110117ed1896"} Mar 09 09:59:14 crc kubenswrapper[4971]: I0309 09:59:14.996440 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" event={"ID":"8367a2d9-a755-4a83-96de-48072345b97c","Type":"ContainerStarted","Data":"d60a62051113617b895638a906fa963ac1f7228bec8d8912bbb2b872dc97e8f9"} Mar 09 09:59:15 crc kubenswrapper[4971]: I0309 09:59:15.016617 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" podStartSLOduration=2.016595286 podStartE2EDuration="2.016595286s" podCreationTimestamp="2026-03-09 09:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:15.014943501 +0000 UTC m=+2358.574871341" watchObservedRunningTime="2026-03-09 09:59:15.016595286 +0000 UTC m=+2358.576523096" Mar 09 09:59:16 crc kubenswrapper[4971]: I0309 09:59:16.005106 4971 generic.go:334] "Generic (PLEG): container finished" podID="8367a2d9-a755-4a83-96de-48072345b97c" containerID="50f20ce0bf95f462480f5417e1377c40128cdb475c8b05fba7fd110117ed1896" exitCode=0 Mar 09 09:59:16 crc kubenswrapper[4971]: I0309 09:59:16.005203 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" 
event={"ID":"8367a2d9-a755-4a83-96de-48072345b97c","Type":"ContainerDied","Data":"50f20ce0bf95f462480f5417e1377c40128cdb475c8b05fba7fd110117ed1896"} Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.336148 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.376264 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c775b"] Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.382991 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c775b"] Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505064 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505187 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505234 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n2l9\" (UniqueName: \"kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505306 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505369 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.505390 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts\") pod \"8367a2d9-a755-4a83-96de-48072345b97c\" (UID: \"8367a2d9-a755-4a83-96de-48072345b97c\") " Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.506032 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.506318 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.517761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9" (OuterVolumeSpecName: "kube-api-access-2n2l9") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "kube-api-access-2n2l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.528193 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts" (OuterVolumeSpecName: "scripts") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.529841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.541991 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8367a2d9-a755-4a83-96de-48072345b97c" (UID: "8367a2d9-a755-4a83-96de-48072345b97c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606794 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606826 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606836 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8367a2d9-a755-4a83-96de-48072345b97c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606845 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8367a2d9-a755-4a83-96de-48072345b97c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606853 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8367a2d9-a755-4a83-96de-48072345b97c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:17 crc kubenswrapper[4971]: I0309 09:59:17.606862 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n2l9\" (UniqueName: \"kubernetes.io/projected/8367a2d9-a755-4a83-96de-48072345b97c-kube-api-access-2n2l9\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.036995 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d60a62051113617b895638a906fa963ac1f7228bec8d8912bbb2b872dc97e8f9" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.037444 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c775b" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.506492 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"] Mar 09 09:59:18 crc kubenswrapper[4971]: E0309 09:59:18.507290 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8367a2d9-a755-4a83-96de-48072345b97c" containerName="swift-ring-rebalance" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.507443 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="8367a2d9-a755-4a83-96de-48072345b97c" containerName="swift-ring-rebalance" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.507694 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="8367a2d9-a755-4a83-96de-48072345b97c" containerName="swift-ring-rebalance" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.508307 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.511119 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.511304 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.518757 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"] Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.624813 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrwt\" (UniqueName: \"kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.624857 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.624997 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.625076 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.625120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.625248 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf\") pod 
\"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726247 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrwt\" (UniqueName: \"kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726294 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726327 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726391 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.726439 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.727151 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.727250 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.727687 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.730791 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.735859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.746034 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrwt\" (UniqueName: \"kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt\") pod \"swift-ring-rebalance-debug-qtvzb\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" Mar 09 09:59:18 crc kubenswrapper[4971]: I0309 09:59:18.827685 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"
Mar 09 09:59:19 crc kubenswrapper[4971]: I0309 09:59:19.163889 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8367a2d9-a755-4a83-96de-48072345b97c" path="/var/lib/kubelet/pods/8367a2d9-a755-4a83-96de-48072345b97c/volumes"
Mar 09 09:59:19 crc kubenswrapper[4971]: I0309 09:59:19.252372 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"]
Mar 09 09:59:20 crc kubenswrapper[4971]: I0309 09:59:20.055468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" event={"ID":"18ceb86a-42e8-4dbf-8c57-f936995fdd05","Type":"ContainerStarted","Data":"b41063b93081525248eaaf64bb95b955d94452c3ca2c31d4954f1b31cba02039"}
Mar 09 09:59:20 crc kubenswrapper[4971]: I0309 09:59:20.055832 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" event={"ID":"18ceb86a-42e8-4dbf-8c57-f936995fdd05","Type":"ContainerStarted","Data":"bda81fbdea867ec84363de32e606a95e803a49f3a20115cf65fb3bde65da53ba"}
Mar 09 09:59:20 crc kubenswrapper[4971]: I0309 09:59:20.072003 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" podStartSLOduration=2.071982223 podStartE2EDuration="2.071982223s" podCreationTimestamp="2026-03-09 09:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:20.071957312 +0000 UTC m=+2363.631885122" watchObservedRunningTime="2026-03-09 09:59:20.071982223 +0000 UTC m=+2363.631910043"
Mar 09 09:59:21 crc kubenswrapper[4971]: I0309 09:59:21.066398 4971 generic.go:334] "Generic (PLEG): container finished" podID="18ceb86a-42e8-4dbf-8c57-f936995fdd05" containerID="b41063b93081525248eaaf64bb95b955d94452c3ca2c31d4954f1b31cba02039" exitCode=0
Mar 09 09:59:21 crc kubenswrapper[4971]: I0309 09:59:21.066516 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb" event={"ID":"18ceb86a-42e8-4dbf-8c57-f936995fdd05","Type":"ContainerDied","Data":"b41063b93081525248eaaf64bb95b955d94452c3ca2c31d4954f1b31cba02039"}
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.351562 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.382358 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"]
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.391762 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"]
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478561 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478606 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478644 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478687 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrwt\" (UniqueName: \"kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478755 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.478787 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf\") pod \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\" (UID: \"18ceb86a-42e8-4dbf-8c57-f936995fdd05\") "
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.479942 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.480501 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.485101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt" (OuterVolumeSpecName: "kube-api-access-qjrwt") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "kube-api-access-qjrwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.499515 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts" (OuterVolumeSpecName: "scripts") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.500876 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.502530 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "18ceb86a-42e8-4dbf-8c57-f936995fdd05" (UID: "18ceb86a-42e8-4dbf-8c57-f936995fdd05"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580052 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580088 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580101 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18ceb86a-42e8-4dbf-8c57-f936995fdd05-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580110 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrwt\" (UniqueName: \"kubernetes.io/projected/18ceb86a-42e8-4dbf-8c57-f936995fdd05-kube-api-access-qjrwt\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580120 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/18ceb86a-42e8-4dbf-8c57-f936995fdd05-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:22 crc kubenswrapper[4971]: I0309 09:59:22.580128 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/18ceb86a-42e8-4dbf-8c57-f936995fdd05-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.087473 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda81fbdea867ec84363de32e606a95e803a49f3a20115cf65fb3bde65da53ba"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.087639 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qtvzb"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.161404 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ceb86a-42e8-4dbf-8c57-f936995fdd05" path="/var/lib/kubelet/pods/18ceb86a-42e8-4dbf-8c57-f936995fdd05/volumes"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.510034 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"]
Mar 09 09:59:23 crc kubenswrapper[4971]: E0309 09:59:23.510400 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ceb86a-42e8-4dbf-8c57-f936995fdd05" containerName="swift-ring-rebalance"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.510414 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ceb86a-42e8-4dbf-8c57-f936995fdd05" containerName="swift-ring-rebalance"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.510579 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ceb86a-42e8-4dbf-8c57-f936995fdd05" containerName="swift-ring-rebalance"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.511091 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.514163 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.514418 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.519522 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"]
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.693804 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.693860 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.693898 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.693942 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9ghr\" (UniqueName: \"kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.694005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.694028 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795325 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9ghr\" (UniqueName: \"kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795537 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795579 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795707 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.795740 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.796080 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.796797 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.796824 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.806162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.806269 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.816017 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9ghr\" (UniqueName: \"kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr\") pod \"swift-ring-rebalance-debug-77b8c\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:23 crc kubenswrapper[4971]: I0309 09:59:23.831789 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:24 crc kubenswrapper[4971]: I0309 09:59:24.251215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"]
Mar 09 09:59:25 crc kubenswrapper[4971]: I0309 09:59:25.102452 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c" event={"ID":"5de6a56b-640c-455a-9e8e-115512e8d43e","Type":"ContainerStarted","Data":"19a3eec3364417e11fa09e5352c12f11703d798d0c7cef43bb83339bf7456876"}
Mar 09 09:59:25 crc kubenswrapper[4971]: I0309 09:59:25.102768 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c" event={"ID":"5de6a56b-640c-455a-9e8e-115512e8d43e","Type":"ContainerStarted","Data":"9adf7050e4be0fe59a5e99b6f8a42a086a6dfe1c265802ba186d2e08acba5686"}
Mar 09 09:59:25 crc kubenswrapper[4971]: I0309 09:59:25.125606 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c" podStartSLOduration=2.125585302 podStartE2EDuration="2.125585302s" podCreationTimestamp="2026-03-09 09:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:25.118446356 +0000 UTC m=+2368.678374166" watchObservedRunningTime="2026-03-09 09:59:25.125585302 +0000 UTC m=+2368.685513112"
Mar 09 09:59:26 crc kubenswrapper[4971]: I0309 09:59:26.115293 4971 generic.go:334] "Generic (PLEG): container finished" podID="5de6a56b-640c-455a-9e8e-115512e8d43e" containerID="19a3eec3364417e11fa09e5352c12f11703d798d0c7cef43bb83339bf7456876" exitCode=0
Mar 09 09:59:26 crc kubenswrapper[4971]: I0309 09:59:26.115394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c" event={"ID":"5de6a56b-640c-455a-9e8e-115512e8d43e","Type":"ContainerDied","Data":"19a3eec3364417e11fa09e5352c12f11703d798d0c7cef43bb83339bf7456876"}
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.427927 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.458921 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"]
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.464592 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"]
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.564993 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.565083 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.565109 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.565224 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9ghr\" (UniqueName: \"kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.565249 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.565316 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices\") pod \"5de6a56b-640c-455a-9e8e-115512e8d43e\" (UID: \"5de6a56b-640c-455a-9e8e-115512e8d43e\") "
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.566130 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.566485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.574545 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr" (OuterVolumeSpecName: "kube-api-access-c9ghr") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "kube-api-access-c9ghr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.584568 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts" (OuterVolumeSpecName: "scripts") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.585762 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.589121 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5de6a56b-640c-455a-9e8e-115512e8d43e" (UID: "5de6a56b-640c-455a-9e8e-115512e8d43e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.666927 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9ghr\" (UniqueName: \"kubernetes.io/projected/5de6a56b-640c-455a-9e8e-115512e8d43e-kube-api-access-c9ghr\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.666977 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.666987 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.666996 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5de6a56b-640c-455a-9e8e-115512e8d43e-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.667005 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5de6a56b-640c-455a-9e8e-115512e8d43e-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:27 crc kubenswrapper[4971]: I0309 09:59:27.667013 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de6a56b-640c-455a-9e8e-115512e8d43e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.131108 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adf7050e4be0fe59a5e99b6f8a42a086a6dfe1c265802ba186d2e08acba5686"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.131156 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-77b8c"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.580669 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"]
Mar 09 09:59:28 crc kubenswrapper[4971]: E0309 09:59:28.581249 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de6a56b-640c-455a-9e8e-115512e8d43e" containerName="swift-ring-rebalance"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.581260 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de6a56b-640c-455a-9e8e-115512e8d43e" containerName="swift-ring-rebalance"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.581427 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de6a56b-640c-455a-9e8e-115512e8d43e" containerName="swift-ring-rebalance"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.581886 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.583574 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.584808 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.599708 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"]
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681460 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681614 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681637 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681675 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh87w\" (UniqueName: \"kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681694 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.681723 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783721 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783788 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783825 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh87w\" (UniqueName: \"kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783852 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783877 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.783934 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.784569 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.785190 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.785279 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.788602 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.790195 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.801023 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh87w\" (UniqueName: \"kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w\") pod \"swift-ring-rebalance-debug-8kr4m\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:28 crc kubenswrapper[4971]: I0309 09:59:28.899061 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"
Mar 09 09:59:29 crc kubenswrapper[4971]: I0309 09:59:29.162149 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de6a56b-640c-455a-9e8e-115512e8d43e" path="/var/lib/kubelet/pods/5de6a56b-640c-455a-9e8e-115512e8d43e/volumes"
Mar 09 09:59:29 crc kubenswrapper[4971]: I0309 09:59:29.327882 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"]
Mar 09 09:59:30 crc kubenswrapper[4971]: I0309 09:59:30.153929 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" event={"ID":"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac","Type":"ContainerStarted","Data":"559b9119735dd68b60865bfec42f44977615ea484db494ad16a6a30b7badf4a1"}
Mar 09 09:59:30 crc kubenswrapper[4971]: I0309 09:59:30.154253 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" event={"ID":"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac","Type":"ContainerStarted","Data":"372dd61892b259cc288db3a44d4cf46fc2ec5cf27a0d4c65878ce01567ba5c6d"}
Mar 09 09:59:30 crc kubenswrapper[4971]: I0309 09:59:30.176804 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" podStartSLOduration=2.176787605 podStartE2EDuration="2.176787605s" podCreationTimestamp="2026-03-09 09:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:30.169461323 +0000 UTC m=+2373.729389143" watchObservedRunningTime="2026-03-09 09:59:30.176787605 +0000 UTC m=+2373.736715415"
Mar 09 09:59:31 crc kubenswrapper[4971]: I0309 09:59:31.168895 4971 generic.go:334] "Generic (PLEG): container finished" podID="ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" containerID="559b9119735dd68b60865bfec42f44977615ea484db494ad16a6a30b7badf4a1" exitCode=0
Mar 09 09:59:31 crc kubenswrapper[4971]: I0309 09:59:31.168998 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" event={"ID":"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac","Type":"ContainerDied","Data":"559b9119735dd68b60865bfec42f44977615ea484db494ad16a6a30b7badf4a1"} Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.444456 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.480852 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"] Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.488747 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m"] Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555264 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555341 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555384 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555464 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh87w\" (UniqueName: \"kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555499 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.555546 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf\") pod \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\" (UID: \"ef5d24f4-de83-401a-b2e2-213c7dfcd6ac\") " Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.556209 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.556653 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.560914 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w" (OuterVolumeSpecName: "kube-api-access-gh87w") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "kube-api-access-gh87w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.575817 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts" (OuterVolumeSpecName: "scripts") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.577518 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.578409 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" (UID: "ef5d24f4-de83-401a-b2e2-213c7dfcd6ac"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657606 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh87w\" (UniqueName: \"kubernetes.io/projected/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-kube-api-access-gh87w\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657652 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657665 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657681 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657691 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:32 crc kubenswrapper[4971]: I0309 09:59:32.657702 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.161694 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" path="/var/lib/kubelet/pods/ef5d24f4-de83-401a-b2e2-213c7dfcd6ac/volumes" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.184206 4971 scope.go:117] "RemoveContainer" 
containerID="559b9119735dd68b60865bfec42f44977615ea484db494ad16a6a30b7badf4a1" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.184251 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8kr4m" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.643109 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb"] Mar 09 09:59:33 crc kubenswrapper[4971]: E0309 09:59:33.644284 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" containerName="swift-ring-rebalance" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.644307 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" containerName="swift-ring-rebalance" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.644523 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5d24f4-de83-401a-b2e2-213c7dfcd6ac" containerName="swift-ring-rebalance" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.645754 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.650907 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.651268 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.653183 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb"] Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.670882 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrm8\" (UniqueName: \"kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.670928 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.670970 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.671031 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.671065 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.671160 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772101 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrm8\" (UniqueName: \"kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772340 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc 
kubenswrapper[4971]: I0309 09:59:33.772427 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772508 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772551 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772600 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.772803 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc 
kubenswrapper[4971]: I0309 09:59:33.773266 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.774013 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.777841 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.777973 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: I0309 09:59:33.790313 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrm8\" (UniqueName: \"kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8\") pod \"swift-ring-rebalance-debug-kzkgb\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:33 crc kubenswrapper[4971]: 
I0309 09:59:33.971784 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:34 crc kubenswrapper[4971]: I0309 09:59:34.420135 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb"] Mar 09 09:59:35 crc kubenswrapper[4971]: I0309 09:59:35.201892 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" event={"ID":"71df1a0b-8156-4095-bccc-571f82f58f9d","Type":"ContainerStarted","Data":"c4ecc5a1f2d36b3a9c1f8a6b86b36611cc38f88183b958b2dd145d47eb8f469f"} Mar 09 09:59:35 crc kubenswrapper[4971]: I0309 09:59:35.203290 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" event={"ID":"71df1a0b-8156-4095-bccc-571f82f58f9d","Type":"ContainerStarted","Data":"14de0cb8b39fb6100411059d8e7a2ef075e8a250f6846c0912411130d2bfc655"} Mar 09 09:59:35 crc kubenswrapper[4971]: I0309 09:59:35.226998 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" podStartSLOduration=2.226978989 podStartE2EDuration="2.226978989s" podCreationTimestamp="2026-03-09 09:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:35.216637155 +0000 UTC m=+2378.776564965" watchObservedRunningTime="2026-03-09 09:59:35.226978989 +0000 UTC m=+2378.786906809" Mar 09 09:59:36 crc kubenswrapper[4971]: I0309 09:59:36.213427 4971 generic.go:334] "Generic (PLEG): container finished" podID="71df1a0b-8156-4095-bccc-571f82f58f9d" containerID="c4ecc5a1f2d36b3a9c1f8a6b86b36611cc38f88183b958b2dd145d47eb8f469f" exitCode=0 Mar 09 09:59:36 crc kubenswrapper[4971]: I0309 09:59:36.213470 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" 
event={"ID":"71df1a0b-8156-4095-bccc-571f82f58f9d","Type":"ContainerDied","Data":"c4ecc5a1f2d36b3a9c1f8a6b86b36611cc38f88183b958b2dd145d47eb8f469f"} Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.499456 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.553293 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb"] Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.559805 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb"] Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.668472 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.668787 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.668814 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.668874 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.668981 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.669005 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmrm8\" (UniqueName: \"kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8\") pod \"71df1a0b-8156-4095-bccc-571f82f58f9d\" (UID: \"71df1a0b-8156-4095-bccc-571f82f58f9d\") " Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.669167 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.669329 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.669936 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.681549 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8" (OuterVolumeSpecName: "kube-api-access-pmrm8") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "kube-api-access-pmrm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.689699 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts" (OuterVolumeSpecName: "scripts") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.692878 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.693481 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "71df1a0b-8156-4095-bccc-571f82f58f9d" (UID: "71df1a0b-8156-4095-bccc-571f82f58f9d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.770495 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71df1a0b-8156-4095-bccc-571f82f58f9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.770548 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmrm8\" (UniqueName: \"kubernetes.io/projected/71df1a0b-8156-4095-bccc-571f82f58f9d-kube-api-access-pmrm8\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.770562 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.770574 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71df1a0b-8156-4095-bccc-571f82f58f9d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:37 crc kubenswrapper[4971]: I0309 09:59:37.770585 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71df1a0b-8156-4095-bccc-571f82f58f9d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.232978 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14de0cb8b39fb6100411059d8e7a2ef075e8a250f6846c0912411130d2bfc655" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.233025 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kzkgb" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.664809 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx"] Mar 09 09:59:38 crc kubenswrapper[4971]: E0309 09:59:38.665184 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71df1a0b-8156-4095-bccc-571f82f58f9d" containerName="swift-ring-rebalance" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.665200 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="71df1a0b-8156-4095-bccc-571f82f58f9d" containerName="swift-ring-rebalance" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.665406 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="71df1a0b-8156-4095-bccc-571f82f58f9d" containerName="swift-ring-rebalance" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.665992 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.671234 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.671601 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.674524 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx"] Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783117 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ptg\" (UniqueName: \"kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783258 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783327 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783635 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.783718 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf\") pod 
\"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885221 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885322 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ptg\" (UniqueName: \"kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885378 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885405 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885469 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.885514 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.886001 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.887205 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.887727 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.889773 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.889873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.903292 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ptg\" (UniqueName: \"kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg\") pod \"swift-ring-rebalance-debug-h2ftx\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:38 crc kubenswrapper[4971]: I0309 09:59:38.983178 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:39 crc kubenswrapper[4971]: I0309 09:59:39.162119 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71df1a0b-8156-4095-bccc-571f82f58f9d" path="/var/lib/kubelet/pods/71df1a0b-8156-4095-bccc-571f82f58f9d/volumes" Mar 09 09:59:39 crc kubenswrapper[4971]: I0309 09:59:39.395630 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx"] Mar 09 09:59:39 crc kubenswrapper[4971]: W0309 09:59:39.406202 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4a50db1_35b4_4d51_93c7_42dcc0ba0966.slice/crio-c0be37f7406450ae52ec344ae136481b9b28a103efc9a721e7e4e0dc3745240b WatchSource:0}: Error finding container c0be37f7406450ae52ec344ae136481b9b28a103efc9a721e7e4e0dc3745240b: Status 404 returned error can't find the container with id c0be37f7406450ae52ec344ae136481b9b28a103efc9a721e7e4e0dc3745240b Mar 09 09:59:40 crc kubenswrapper[4971]: I0309 09:59:40.249541 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" event={"ID":"f4a50db1-35b4-4d51-93c7-42dcc0ba0966","Type":"ContainerStarted","Data":"a7717fbca1ae023ce28e5ec93363d049b08c2f0a769ba080356ff17b3225e3c9"} Mar 09 09:59:40 crc kubenswrapper[4971]: I0309 09:59:40.249865 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" event={"ID":"f4a50db1-35b4-4d51-93c7-42dcc0ba0966","Type":"ContainerStarted","Data":"c0be37f7406450ae52ec344ae136481b9b28a103efc9a721e7e4e0dc3745240b"} Mar 09 09:59:40 crc kubenswrapper[4971]: I0309 09:59:40.266339 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" podStartSLOduration=2.266319736 podStartE2EDuration="2.266319736s" podCreationTimestamp="2026-03-09 
09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:40.264235459 +0000 UTC m=+2383.824163279" watchObservedRunningTime="2026-03-09 09:59:40.266319736 +0000 UTC m=+2383.826247546" Mar 09 09:59:41 crc kubenswrapper[4971]: I0309 09:59:41.263249 4971 generic.go:334] "Generic (PLEG): container finished" podID="f4a50db1-35b4-4d51-93c7-42dcc0ba0966" containerID="a7717fbca1ae023ce28e5ec93363d049b08c2f0a769ba080356ff17b3225e3c9" exitCode=0 Mar 09 09:59:41 crc kubenswrapper[4971]: I0309 09:59:41.263340 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" event={"ID":"f4a50db1-35b4-4d51-93c7-42dcc0ba0966","Type":"ContainerDied","Data":"a7717fbca1ae023ce28e5ec93363d049b08c2f0a769ba080356ff17b3225e3c9"} Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.515177 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542095 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542149 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542193 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542219 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542247 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.542264 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ptg\" (UniqueName: \"kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg\") pod \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\" (UID: \"f4a50db1-35b4-4d51-93c7-42dcc0ba0966\") " Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.544925 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.545847 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.548476 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx"] Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.548528 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg" (OuterVolumeSpecName: "kube-api-access-q8ptg") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "kube-api-access-q8ptg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.552584 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx"] Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.562766 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.564293 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.566962 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts" (OuterVolumeSpecName: "scripts") pod "f4a50db1-35b4-4d51-93c7-42dcc0ba0966" (UID: "f4a50db1-35b4-4d51-93c7-42dcc0ba0966"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.644175 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.644207 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.644222 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.644232 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:42 crc 
kubenswrapper[4971]: I0309 09:59:42.644244 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ptg\" (UniqueName: \"kubernetes.io/projected/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-kube-api-access-q8ptg\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:42 crc kubenswrapper[4971]: I0309 09:59:42.644256 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4a50db1-35b4-4d51-93c7-42dcc0ba0966-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.165001 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a50db1-35b4-4d51-93c7-42dcc0ba0966" path="/var/lib/kubelet/pods/f4a50db1-35b4-4d51-93c7-42dcc0ba0966/volumes" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.280503 4971 scope.go:117] "RemoveContainer" containerID="a7717fbca1ae023ce28e5ec93363d049b08c2f0a769ba080356ff17b3225e3c9" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.280546 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h2ftx" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.708670 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf"] Mar 09 09:59:43 crc kubenswrapper[4971]: E0309 09:59:43.709230 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a50db1-35b4-4d51-93c7-42dcc0ba0966" containerName="swift-ring-rebalance" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.709244 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a50db1-35b4-4d51-93c7-42dcc0ba0966" containerName="swift-ring-rebalance" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.709393 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a50db1-35b4-4d51-93c7-42dcc0ba0966" containerName="swift-ring-rebalance" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.709948 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.714789 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.717292 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.728164 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf"] Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.757953 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.758019 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp9kn\" (UniqueName: \"kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.758096 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.758130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.758149 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.758168 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860640 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860718 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860751 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860779 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860869 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.860900 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp9kn\" (UniqueName: \"kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.861595 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.861890 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.861900 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.865025 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.873474 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:43 crc kubenswrapper[4971]: I0309 09:59:43.886719 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp9kn\" (UniqueName: \"kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn\") pod \"swift-ring-rebalance-debug-2ldxf\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:44 crc kubenswrapper[4971]: I0309 09:59:44.032644 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:44 crc kubenswrapper[4971]: I0309 09:59:44.459499 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf"] Mar 09 09:59:44 crc kubenswrapper[4971]: W0309 09:59:44.462114 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5c01b16_5e73_4637_88c9_d82f3fb7a747.slice/crio-b3a84848872749a535af282df1beafd62a001fddfc31f14089e2ae8b33f73472 WatchSource:0}: Error finding container b3a84848872749a535af282df1beafd62a001fddfc31f14089e2ae8b33f73472: Status 404 returned error can't find the container with id b3a84848872749a535af282df1beafd62a001fddfc31f14089e2ae8b33f73472 Mar 09 09:59:45 crc kubenswrapper[4971]: I0309 09:59:45.302271 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" event={"ID":"b5c01b16-5e73-4637-88c9-d82f3fb7a747","Type":"ContainerStarted","Data":"61b867e30f8fe175a116c548a92649ea336dc608ac8f50edb676419f68ea2343"} Mar 09 09:59:45 crc kubenswrapper[4971]: I0309 09:59:45.302545 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" event={"ID":"b5c01b16-5e73-4637-88c9-d82f3fb7a747","Type":"ContainerStarted","Data":"b3a84848872749a535af282df1beafd62a001fddfc31f14089e2ae8b33f73472"} Mar 09 09:59:45 crc kubenswrapper[4971]: I0309 09:59:45.350443 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" podStartSLOduration=2.350422471 podStartE2EDuration="2.350422471s" podCreationTimestamp="2026-03-09 09:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:45.347042579 +0000 UTC m=+2388.906970389" watchObservedRunningTime="2026-03-09 
09:59:45.350422471 +0000 UTC m=+2388.910350291" Mar 09 09:59:46 crc kubenswrapper[4971]: I0309 09:59:46.314553 4971 generic.go:334] "Generic (PLEG): container finished" podID="b5c01b16-5e73-4637-88c9-d82f3fb7a747" containerID="61b867e30f8fe175a116c548a92649ea336dc608ac8f50edb676419f68ea2343" exitCode=0 Mar 09 09:59:46 crc kubenswrapper[4971]: I0309 09:59:46.314613 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" event={"ID":"b5c01b16-5e73-4637-88c9-d82f3fb7a747","Type":"ContainerDied","Data":"61b867e30f8fe175a116c548a92649ea336dc608ac8f50edb676419f68ea2343"} Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.561282 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.603574 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf"] Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.612700 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf"] Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712423 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712582 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp9kn\" (UniqueName: \"kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712614 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712653 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712739 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.712773 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift\") pod \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\" (UID: \"b5c01b16-5e73-4637-88c9-d82f3fb7a747\") " Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.713485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.713853 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.717590 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn" (OuterVolumeSpecName: "kube-api-access-fp9kn") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "kube-api-access-fp9kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.733503 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts" (OuterVolumeSpecName: "scripts") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.737380 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.746885 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b5c01b16-5e73-4637-88c9-d82f3fb7a747" (UID: "b5c01b16-5e73-4637-88c9-d82f3fb7a747"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.814947 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp9kn\" (UniqueName: \"kubernetes.io/projected/b5c01b16-5e73-4637-88c9-d82f3fb7a747-kube-api-access-fp9kn\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.814997 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.815012 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.815027 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5c01b16-5e73-4637-88c9-d82f3fb7a747-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.815040 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5c01b16-5e73-4637-88c9-d82f3fb7a747-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:47 crc kubenswrapper[4971]: I0309 09:59:47.815053 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b5c01b16-5e73-4637-88c9-d82f3fb7a747-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.331563 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a84848872749a535af282df1beafd62a001fddfc31f14089e2ae8b33f73472" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.332126 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2ldxf" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.733578 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-86kr5"] Mar 09 09:59:48 crc kubenswrapper[4971]: E0309 09:59:48.733951 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c01b16-5e73-4637-88c9-d82f3fb7a747" containerName="swift-ring-rebalance" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.733965 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c01b16-5e73-4637-88c9-d82f3fb7a747" containerName="swift-ring-rebalance" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.734135 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c01b16-5e73-4637-88c9-d82f3fb7a747" containerName="swift-ring-rebalance" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.734698 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.744215 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-86kr5"] Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.744671 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.744874 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932412 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932489 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlsr\" (UniqueName: \"kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932515 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932579 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932607 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:48 crc kubenswrapper[4971]: I0309 09:59:48.932629 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034120 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034170 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 
09:59:49.034195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034225 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034251 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlsr\" (UniqueName: \"kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034273 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.034809 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc 
kubenswrapper[4971]: I0309 09:59:49.035245 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.035320 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.040011 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.040194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.055428 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlsr\" (UniqueName: \"kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr\") pod \"swift-ring-rebalance-debug-86kr5\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: 
I0309 09:59:49.068314 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.163452 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c01b16-5e73-4637-88c9-d82f3fb7a747" path="/var/lib/kubelet/pods/b5c01b16-5e73-4637-88c9-d82f3fb7a747/volumes" Mar 09 09:59:49 crc kubenswrapper[4971]: I0309 09:59:49.529571 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-86kr5"] Mar 09 09:59:49 crc kubenswrapper[4971]: W0309 09:59:49.530448 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod779de251_8b36_4a76_9a74_216ac9d7ffff.slice/crio-ea5b69600519741e5a5bb6c4aae1ac31531159f9cef0b29c85ffaa7af65504eb WatchSource:0}: Error finding container ea5b69600519741e5a5bb6c4aae1ac31531159f9cef0b29c85ffaa7af65504eb: Status 404 returned error can't find the container with id ea5b69600519741e5a5bb6c4aae1ac31531159f9cef0b29c85ffaa7af65504eb Mar 09 09:59:50 crc kubenswrapper[4971]: I0309 09:59:50.349756 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" event={"ID":"779de251-8b36-4a76-9a74-216ac9d7ffff","Type":"ContainerStarted","Data":"52a2135aa2d082c6ff82036cad7a145dd0e9c438ae526d1d37fc5f01e664bfc5"} Mar 09 09:59:50 crc kubenswrapper[4971]: I0309 09:59:50.350094 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" event={"ID":"779de251-8b36-4a76-9a74-216ac9d7ffff","Type":"ContainerStarted","Data":"ea5b69600519741e5a5bb6c4aae1ac31531159f9cef0b29c85ffaa7af65504eb"} Mar 09 09:59:50 crc kubenswrapper[4971]: I0309 09:59:50.369877 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" 
podStartSLOduration=2.369852412 podStartE2EDuration="2.369852412s" podCreationTimestamp="2026-03-09 09:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:50.366993033 +0000 UTC m=+2393.926920843" watchObservedRunningTime="2026-03-09 09:59:50.369852412 +0000 UTC m=+2393.929780212" Mar 09 09:59:51 crc kubenswrapper[4971]: I0309 09:59:51.359408 4971 generic.go:334] "Generic (PLEG): container finished" podID="779de251-8b36-4a76-9a74-216ac9d7ffff" containerID="52a2135aa2d082c6ff82036cad7a145dd0e9c438ae526d1d37fc5f01e664bfc5" exitCode=0 Mar 09 09:59:51 crc kubenswrapper[4971]: I0309 09:59:51.359463 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" event={"ID":"779de251-8b36-4a76-9a74-216ac9d7ffff","Type":"ContainerDied","Data":"52a2135aa2d082c6ff82036cad7a145dd0e9c438ae526d1d37fc5f01e664bfc5"} Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.625579 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.659144 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-86kr5"] Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.663551 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-86kr5"] Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788432 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788498 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788592 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788634 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788697 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxlsr\" (UniqueName: 
\"kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.788729 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices\") pod \"779de251-8b36-4a76-9a74-216ac9d7ffff\" (UID: \"779de251-8b36-4a76-9a74-216ac9d7ffff\") " Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.789638 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.789765 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.796761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr" (OuterVolumeSpecName: "kube-api-access-mxlsr") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "kube-api-access-mxlsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.810593 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.811162 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts" (OuterVolumeSpecName: "scripts") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.811904 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "779de251-8b36-4a76-9a74-216ac9d7ffff" (UID: "779de251-8b36-4a76-9a74-216ac9d7ffff"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890217 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890257 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890270 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxlsr\" (UniqueName: \"kubernetes.io/projected/779de251-8b36-4a76-9a74-216ac9d7ffff-kube-api-access-mxlsr\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890284 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/779de251-8b36-4a76-9a74-216ac9d7ffff-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890295 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/779de251-8b36-4a76-9a74-216ac9d7ffff-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:52 crc kubenswrapper[4971]: I0309 09:59:52.890306 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/779de251-8b36-4a76-9a74-216ac9d7ffff-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.166811 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="779de251-8b36-4a76-9a74-216ac9d7ffff" path="/var/lib/kubelet/pods/779de251-8b36-4a76-9a74-216ac9d7ffff/volumes" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.383948 4971 scope.go:117] "RemoveContainer" 
containerID="52a2135aa2d082c6ff82036cad7a145dd0e9c438ae526d1d37fc5f01e664bfc5" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.383999 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-86kr5" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.795935 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz"] Mar 09 09:59:53 crc kubenswrapper[4971]: E0309 09:59:53.796275 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779de251-8b36-4a76-9a74-216ac9d7ffff" containerName="swift-ring-rebalance" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.796290 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="779de251-8b36-4a76-9a74-216ac9d7ffff" containerName="swift-ring-rebalance" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.796490 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="779de251-8b36-4a76-9a74-216ac9d7ffff" containerName="swift-ring-rebalance" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.797041 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.798743 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.798783 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801714 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801774 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801814 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801885 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf\") pod 
\"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801925 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.801956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvxhj\" (UniqueName: \"kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.808174 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz"] Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.903022 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.903219 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc 
kubenswrapper[4971]: I0309 09:59:53.903266 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.903333 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.903420 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.903449 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvxhj\" (UniqueName: \"kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.904321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc 
kubenswrapper[4971]: I0309 09:59:53.904820 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.905486 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.908133 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.911709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:53 crc kubenswrapper[4971]: I0309 09:59:53.924974 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvxhj\" (UniqueName: \"kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj\") pod \"swift-ring-rebalance-debug-k2qmz\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:54 crc kubenswrapper[4971]: 
I0309 09:59:54.118812 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:54 crc kubenswrapper[4971]: I0309 09:59:54.625630 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz"] Mar 09 09:59:55 crc kubenswrapper[4971]: I0309 09:59:55.403069 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" event={"ID":"b7584ca2-4434-45f4-8189-551c5afc25f8","Type":"ContainerStarted","Data":"d52b25a01b3bc92c585e33660df738331f5a5cd1546de767cbaf368a82f8761a"} Mar 09 09:59:55 crc kubenswrapper[4971]: I0309 09:59:55.403540 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" event={"ID":"b7584ca2-4434-45f4-8189-551c5afc25f8","Type":"ContainerStarted","Data":"d91929604194318c8ae04e93cdac4c5ce5199679e29ab2ef146b2dec2f226c81"} Mar 09 09:59:55 crc kubenswrapper[4971]: I0309 09:59:55.428130 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" podStartSLOduration=2.428109818 podStartE2EDuration="2.428109818s" podCreationTimestamp="2026-03-09 09:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:59:55.423199153 +0000 UTC m=+2398.983126973" watchObservedRunningTime="2026-03-09 09:59:55.428109818 +0000 UTC m=+2398.988037628" Mar 09 09:59:56 crc kubenswrapper[4971]: I0309 09:59:56.414606 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7584ca2-4434-45f4-8189-551c5afc25f8" containerID="d52b25a01b3bc92c585e33660df738331f5a5cd1546de767cbaf368a82f8761a" exitCode=0 Mar 09 09:59:56 crc kubenswrapper[4971]: I0309 09:59:56.414686 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" 
event={"ID":"b7584ca2-4434-45f4-8189-551c5afc25f8","Type":"ContainerDied","Data":"d52b25a01b3bc92c585e33660df738331f5a5cd1546de767cbaf368a82f8761a"} Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.718708 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.814715 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz"] Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.820910 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz"] Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869287 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvxhj\" (UniqueName: \"kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869416 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869524 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869614 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.869658 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf\") pod \"b7584ca2-4434-45f4-8189-551c5afc25f8\" (UID: \"b7584ca2-4434-45f4-8189-551c5afc25f8\") " Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.871216 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.871846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.877777 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj" (OuterVolumeSpecName: "kube-api-access-lvxhj") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "kube-api-access-lvxhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.895462 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts" (OuterVolumeSpecName: "scripts") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.898043 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.899994 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7584ca2-4434-45f4-8189-551c5afc25f8" (UID: "b7584ca2-4434-45f4-8189-551c5afc25f8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.971942 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.972004 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7584ca2-4434-45f4-8189-551c5afc25f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.972018 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7584ca2-4434-45f4-8189-551c5afc25f8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.972033 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.972046 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7584ca2-4434-45f4-8189-551c5afc25f8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:57 crc kubenswrapper[4971]: I0309 09:59:57.972077 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvxhj\" (UniqueName: \"kubernetes.io/projected/b7584ca2-4434-45f4-8189-551c5afc25f8-kube-api-access-lvxhj\") on node \"crc\" DevicePath \"\"" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.433394 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d91929604194318c8ae04e93cdac4c5ce5199679e29ab2ef146b2dec2f226c81" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.433466 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-k2qmz" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.952550 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr"] Mar 09 09:59:58 crc kubenswrapper[4971]: E0309 09:59:58.952924 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7584ca2-4434-45f4-8189-551c5afc25f8" containerName="swift-ring-rebalance" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.952940 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7584ca2-4434-45f4-8189-551c5afc25f8" containerName="swift-ring-rebalance" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.953117 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7584ca2-4434-45f4-8189-551c5afc25f8" containerName="swift-ring-rebalance" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.953743 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.955681 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.955967 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 09:59:58 crc kubenswrapper[4971]: I0309 09:59:58.966052 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr"] Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.086919 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.086990 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.087012 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.087246 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.087301 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.087396 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4dx\" (UniqueName: 
\"kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.160527 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7584ca2-4434-45f4-8189-551c5afc25f8" path="/var/lib/kubelet/pods/b7584ca2-4434-45f4-8189-551c5afc25f8/volumes" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.188935 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.188999 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189025 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189335 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: 
\"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189413 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189456 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4dx\" (UniqueName: \"kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189648 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.189859 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.190194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: 
\"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.194281 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.195181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.208081 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4dx\" (UniqueName: \"kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx\") pod \"swift-ring-rebalance-debug-5dpqr\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.280861 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 09:59:59 crc kubenswrapper[4971]: I0309 09:59:59.679108 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr"] Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.138895 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550840-td6gm"] Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.140184 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.142281 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.142550 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.142745 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.147378 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-td6gm"] Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.237996 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt"] Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.238900 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.242526 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.246041 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.248512 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt"] Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.314469 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjvv2\" (UniqueName: \"kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2\") pod \"auto-csr-approver-29550840-td6gm\" (UID: \"eb66c60e-a138-4804-9aaf-db389174e600\") " pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.416265 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltkj\" (UniqueName: \"kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.416519 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.416813 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.416986 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjvv2\" (UniqueName: \"kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2\") pod \"auto-csr-approver-29550840-td6gm\" (UID: \"eb66c60e-a138-4804-9aaf-db389174e600\") " pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.448323 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjvv2\" (UniqueName: \"kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2\") pod \"auto-csr-approver-29550840-td6gm\" (UID: \"eb66c60e-a138-4804-9aaf-db389174e600\") " pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.454372 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" event={"ID":"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337","Type":"ContainerStarted","Data":"c571fdc5da8640ef8bae6dd0d3a68120e54b02268ac517bb78522740c37eedd9"} Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.454428 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" event={"ID":"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337","Type":"ContainerStarted","Data":"972a7d8ec26e45352f8bf24774f809d6796aea0e95e982fd000f20fb29c13909"} Mar 
09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.459337 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.477802 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" podStartSLOduration=2.477778808 podStartE2EDuration="2.477778808s" podCreationTimestamp="2026-03-09 09:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:00.473589293 +0000 UTC m=+2404.033517103" watchObservedRunningTime="2026-03-09 10:00:00.477778808 +0000 UTC m=+2404.037706618" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.518761 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltkj\" (UniqueName: \"kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.519506 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.519556 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.521090 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.530866 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.541504 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltkj\" (UniqueName: \"kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj\") pod \"collect-profiles-29550840-slnbt\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.570670 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.799023 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt"] Mar 09 10:00:00 crc kubenswrapper[4971]: W0309 10:00:00.801645 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb21c459a_06b6_45eb_aeb3_67e236cfc358.slice/crio-8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d WatchSource:0}: Error finding container 8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d: Status 404 returned error can't find the container with id 8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d Mar 09 10:00:00 crc kubenswrapper[4971]: I0309 10:00:00.913410 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-td6gm"] Mar 09 10:00:00 crc kubenswrapper[4971]: W0309 10:00:00.916155 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb66c60e_a138_4804_9aaf_db389174e600.slice/crio-2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9 WatchSource:0}: Error finding container 2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9: Status 404 returned error can't find the container with id 2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9 Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.464796 4971 generic.go:334] "Generic (PLEG): container finished" podID="b21c459a-06b6-45eb-aeb3-67e236cfc358" containerID="b14a65946020f423e9d26ece9c0127c299766838e5655dd572cfd45ae2ede457" exitCode=0 Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.464860 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" 
event={"ID":"b21c459a-06b6-45eb-aeb3-67e236cfc358","Type":"ContainerDied","Data":"b14a65946020f423e9d26ece9c0127c299766838e5655dd572cfd45ae2ede457"} Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.464923 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" event={"ID":"b21c459a-06b6-45eb-aeb3-67e236cfc358","Type":"ContainerStarted","Data":"8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d"} Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.466627 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-td6gm" event={"ID":"eb66c60e-a138-4804-9aaf-db389174e600","Type":"ContainerStarted","Data":"2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9"} Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.468302 4971 generic.go:334] "Generic (PLEG): container finished" podID="4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" containerID="c571fdc5da8640ef8bae6dd0d3a68120e54b02268ac517bb78522740c37eedd9" exitCode=0 Mar 09 10:00:01 crc kubenswrapper[4971]: I0309 10:00:01.468335 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" event={"ID":"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337","Type":"ContainerDied","Data":"c571fdc5da8640ef8bae6dd0d3a68120e54b02268ac517bb78522740c37eedd9"} Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.788787 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.797971 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.843277 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr"] Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.849102 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr"] Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.853835 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume\") pod \"b21c459a-06b6-45eb-aeb3-67e236cfc358\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.853897 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume\") pod \"b21c459a-06b6-45eb-aeb3-67e236cfc358\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.853928 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cltkj\" (UniqueName: \"kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj\") pod \"b21c459a-06b6-45eb-aeb3-67e236cfc358\" (UID: \"b21c459a-06b6-45eb-aeb3-67e236cfc358\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.854671 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume" (OuterVolumeSpecName: "config-volume") pod "b21c459a-06b6-45eb-aeb3-67e236cfc358" (UID: "b21c459a-06b6-45eb-aeb3-67e236cfc358"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.854788 4971 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b21c459a-06b6-45eb-aeb3-67e236cfc358-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.858995 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj" (OuterVolumeSpecName: "kube-api-access-cltkj") pod "b21c459a-06b6-45eb-aeb3-67e236cfc358" (UID: "b21c459a-06b6-45eb-aeb3-67e236cfc358"). InnerVolumeSpecName "kube-api-access-cltkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.859015 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b21c459a-06b6-45eb-aeb3-67e236cfc358" (UID: "b21c459a-06b6-45eb-aeb3-67e236cfc358"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.955936 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.955985 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4dx\" (UniqueName: \"kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956041 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956178 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956222 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956247 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf\") pod \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\" (UID: \"4e7d3c5b-1982-4aa5-a56f-8d120bd9e337\") " Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956642 4971 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b21c459a-06b6-45eb-aeb3-67e236cfc358-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956668 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cltkj\" (UniqueName: \"kubernetes.io/projected/b21c459a-06b6-45eb-aeb3-67e236cfc358-kube-api-access-cltkj\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.956868 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.957283 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.960892 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx" (OuterVolumeSpecName: "kube-api-access-hn4dx") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "kube-api-access-hn4dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.978737 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.978929 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:02 crc kubenswrapper[4971]: I0309 10:00:02.978976 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts" (OuterVolumeSpecName: "scripts") pod "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" (UID: "4e7d3c5b-1982-4aa5-a56f-8d120bd9e337"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058237 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058289 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058338 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058392 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058406 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.058418 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4dx\" (UniqueName: \"kubernetes.io/projected/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337-kube-api-access-hn4dx\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.162689 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" path="/var/lib/kubelet/pods/4e7d3c5b-1982-4aa5-a56f-8d120bd9e337/volumes" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.488046 4971 scope.go:117] "RemoveContainer" 
containerID="c571fdc5da8640ef8bae6dd0d3a68120e54b02268ac517bb78522740c37eedd9" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.488055 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5dpqr" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.491069 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" event={"ID":"b21c459a-06b6-45eb-aeb3-67e236cfc358","Type":"ContainerDied","Data":"8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d"} Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.491108 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6c09cc05e0744962320eb37eb316781146f00dbd9a17f86bebee9bd768852d" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.491160 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-slnbt" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.853283 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"] Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.859620 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-nmbwp"] Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.990294 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n9v67"] Mar 09 10:00:03 crc kubenswrapper[4971]: E0309 10:00:03.990876 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" containerName="swift-ring-rebalance" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.990890 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" 
containerName="swift-ring-rebalance" Mar 09 10:00:03 crc kubenswrapper[4971]: E0309 10:00:03.990920 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21c459a-06b6-45eb-aeb3-67e236cfc358" containerName="collect-profiles" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.990927 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21c459a-06b6-45eb-aeb3-67e236cfc358" containerName="collect-profiles" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.991084 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e7d3c5b-1982-4aa5-a56f-8d120bd9e337" containerName="swift-ring-rebalance" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.991107 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b21c459a-06b6-45eb-aeb3-67e236cfc358" containerName="collect-profiles" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.991685 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.993611 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:03 crc kubenswrapper[4971]: I0309 10:00:03.994588 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.016388 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n9v67"] Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.073665 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 
crc kubenswrapper[4971]: I0309 10:00:04.073734 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rmr\" (UniqueName: \"kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.073760 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.073788 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.073802 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.073872 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: 
\"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.174947 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175021 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rmr\" (UniqueName: \"kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175041 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175098 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175138 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.175719 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.176055 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.176167 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.182100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf\") pod \"swift-ring-rebalance-debug-n9v67\" 
(UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.182867 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.192733 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rmr\" (UniqueName: \"kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr\") pod \"swift-ring-rebalance-debug-n9v67\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.336869 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.502148 4971 generic.go:334] "Generic (PLEG): container finished" podID="eb66c60e-a138-4804-9aaf-db389174e600" containerID="68f513651164bac4e873985650e6dd7c50049c37cb341a47f3d6f1edf2bd97a4" exitCode=0 Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.502630 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-td6gm" event={"ID":"eb66c60e-a138-4804-9aaf-db389174e600","Type":"ContainerDied","Data":"68f513651164bac4e873985650e6dd7c50049c37cb341a47f3d6f1edf2bd97a4"} Mar 09 10:00:04 crc kubenswrapper[4971]: I0309 10:00:04.557231 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n9v67"] Mar 09 10:00:04 crc kubenswrapper[4971]: W0309 10:00:04.562302 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dca8b63_995b_4b16_a762_c8af5620cb44.slice/crio-ae67caf36f9327a0e3ac73cbf4313863d73b2bfaecff3e684656f3b01eb2ad87 WatchSource:0}: Error finding container ae67caf36f9327a0e3ac73cbf4313863d73b2bfaecff3e684656f3b01eb2ad87: Status 404 returned error can't find the container with id ae67caf36f9327a0e3ac73cbf4313863d73b2bfaecff3e684656f3b01eb2ad87 Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.165195 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1707bff4-eb31-4ed0-bbc5-054813b1a34a" path="/var/lib/kubelet/pods/1707bff4-eb31-4ed0-bbc5-054813b1a34a/volumes" Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.526655 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" event={"ID":"1dca8b63-995b-4b16-a762-c8af5620cb44","Type":"ContainerStarted","Data":"73a86e24c36b93d5ca44dc44599927bcf62f23e1605c0315854cbf4e70734cb6"} Mar 09 10:00:05 crc kubenswrapper[4971]: 
I0309 10:00:05.526696 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" event={"ID":"1dca8b63-995b-4b16-a762-c8af5620cb44","Type":"ContainerStarted","Data":"ae67caf36f9327a0e3ac73cbf4313863d73b2bfaecff3e684656f3b01eb2ad87"} Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.540676 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" podStartSLOduration=2.540658783 podStartE2EDuration="2.540658783s" podCreationTimestamp="2026-03-09 10:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:05.539628054 +0000 UTC m=+2409.099555864" watchObservedRunningTime="2026-03-09 10:00:05.540658783 +0000 UTC m=+2409.100586593" Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.790605 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.902557 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjvv2\" (UniqueName: \"kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2\") pod \"eb66c60e-a138-4804-9aaf-db389174e600\" (UID: \"eb66c60e-a138-4804-9aaf-db389174e600\") " Mar 09 10:00:05 crc kubenswrapper[4971]: I0309 10:00:05.908759 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2" (OuterVolumeSpecName: "kube-api-access-zjvv2") pod "eb66c60e-a138-4804-9aaf-db389174e600" (UID: "eb66c60e-a138-4804-9aaf-db389174e600"). InnerVolumeSpecName "kube-api-access-zjvv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.004276 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjvv2\" (UniqueName: \"kubernetes.io/projected/eb66c60e-a138-4804-9aaf-db389174e600-kube-api-access-zjvv2\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.537768 4971 generic.go:334] "Generic (PLEG): container finished" podID="1dca8b63-995b-4b16-a762-c8af5620cb44" containerID="73a86e24c36b93d5ca44dc44599927bcf62f23e1605c0315854cbf4e70734cb6" exitCode=0 Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.537843 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" event={"ID":"1dca8b63-995b-4b16-a762-c8af5620cb44","Type":"ContainerDied","Data":"73a86e24c36b93d5ca44dc44599927bcf62f23e1605c0315854cbf4e70734cb6"} Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.540488 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-td6gm" event={"ID":"eb66c60e-a138-4804-9aaf-db389174e600","Type":"ContainerDied","Data":"2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9"} Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.540519 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a52b80fec53826e83dd592c4a3a484261cb84320d9d727698f8db260d2febf9" Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.540566 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-td6gm" Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.845983 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-5m28w"] Mar 09 10:00:06 crc kubenswrapper[4971]: I0309 10:00:06.852255 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-5m28w"] Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.098783 4971 scope.go:117] "RemoveContainer" containerID="1d2d9f4d574f0f9f5b31c9f7e0a87f1817f41fbe020e6e08463e724497da4a8d" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.126003 4971 scope.go:117] "RemoveContainer" containerID="47815d72534e469733e7130be9a8d78589dcbf3ab1208bc7459ae3ec23a27e59" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.164490 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb29bf2-443e-4d7b-956a-001dcf4455e9" path="/var/lib/kubelet/pods/afb29bf2-443e-4d7b-956a-001dcf4455e9/volumes" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.171288 4971 scope.go:117] "RemoveContainer" containerID="3148319edd60aa94712d85ca00a82a4c1396f1a74d3c620745666993e43a563a" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.216421 4971 scope.go:117] "RemoveContainer" containerID="bfc6c65baef6ec6a03c68b7f356c51de958ad81a4f46d048aad101a602259734" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.244539 4971 scope.go:117] "RemoveContainer" containerID="37bd5fa24ff71a172d90b87134a7011fd18934e20dd6fa4857ba70b1e0b2ac47" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.296035 4971 scope.go:117] "RemoveContainer" containerID="47fdddbf06817cefed5a211ccd3459396039eaf1b67b3218d67c3d5dcc40ac18" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.317015 4971 scope.go:117] "RemoveContainer" containerID="3a3bf38e04e48b829d4c1a02c9e5f7d78c8251844e6871eec6d3e9709c9b61a8" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.340861 4971 
scope.go:117] "RemoveContainer" containerID="34361f49e943722262d3935de3cc3fb473a0c544160ef918c9a89065e6e29ad2" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.788535 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.824395 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n9v67"] Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.832768 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n9v67"] Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.931933 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rmr\" (UniqueName: \"kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr\") pod \"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.932034 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf\") pod \"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.932111 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices\") pod \"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.932148 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf\") pod 
\"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.932184 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift\") pod \"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.932206 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts\") pod \"1dca8b63-995b-4b16-a762-c8af5620cb44\" (UID: \"1dca8b63-995b-4b16-a762-c8af5620cb44\") " Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.933045 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.933113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.937760 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr" (OuterVolumeSpecName: "kube-api-access-45rmr") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "kube-api-access-45rmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.951403 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts" (OuterVolumeSpecName: "scripts") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.953872 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4971]: I0309 10:00:07.954774 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1dca8b63-995b-4b16-a762-c8af5620cb44" (UID: "1dca8b63-995b-4b16-a762-c8af5620cb44"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033691 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1dca8b63-995b-4b16-a762-c8af5620cb44-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033741 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033751 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rmr\" (UniqueName: \"kubernetes.io/projected/1dca8b63-995b-4b16-a762-c8af5620cb44-kube-api-access-45rmr\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033760 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033769 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1dca8b63-995b-4b16-a762-c8af5620cb44-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.033781 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1dca8b63-995b-4b16-a762-c8af5620cb44-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.556972 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae67caf36f9327a0e3ac73cbf4313863d73b2bfaecff3e684656f3b01eb2ad87" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.556998 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n9v67" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.956138 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc"] Mar 09 10:00:08 crc kubenswrapper[4971]: E0309 10:00:08.956437 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dca8b63-995b-4b16-a762-c8af5620cb44" containerName="swift-ring-rebalance" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.956452 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dca8b63-995b-4b16-a762-c8af5620cb44" containerName="swift-ring-rebalance" Mar 09 10:00:08 crc kubenswrapper[4971]: E0309 10:00:08.956471 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb66c60e-a138-4804-9aaf-db389174e600" containerName="oc" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.956479 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb66c60e-a138-4804-9aaf-db389174e600" containerName="oc" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.956623 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dca8b63-995b-4b16-a762-c8af5620cb44" containerName="swift-ring-rebalance" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.956645 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb66c60e-a138-4804-9aaf-db389174e600" containerName="oc" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.957102 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.959934 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.962465 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:08 crc kubenswrapper[4971]: I0309 10:00:08.967210 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc"] Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.046957 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.047039 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.047073 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.047090 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.047118 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.047152 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgsl\" (UniqueName: \"kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148547 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148614 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148641 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148675 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148722 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgsl\" (UniqueName: \"kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.148778 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.149194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.149600 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.149700 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.156693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.161281 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dca8b63-995b-4b16-a762-c8af5620cb44" path="/var/lib/kubelet/pods/1dca8b63-995b-4b16-a762-c8af5620cb44/volumes" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.168516 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgsl\" (UniqueName: \"kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.169312 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf\") pod \"swift-ring-rebalance-debug-rpgkc\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.272834 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:09 crc kubenswrapper[4971]: I0309 10:00:09.737783 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc"] Mar 09 10:00:10 crc kubenswrapper[4971]: I0309 10:00:10.588158 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" event={"ID":"bd27aca6-88cd-42b3-93f1-8a35dd011ab7","Type":"ContainerStarted","Data":"53e80d3834b2cfa27323b3ec5f8484f2c6919479f40f841028401dfd86844cd7"} Mar 09 10:00:10 crc kubenswrapper[4971]: I0309 10:00:10.588785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" event={"ID":"bd27aca6-88cd-42b3-93f1-8a35dd011ab7","Type":"ContainerStarted","Data":"5ebc6c0ae89aaa3c03da14c969812fc54d9fa613e34b1085a92ab8b99007ed93"} Mar 09 10:00:10 crc kubenswrapper[4971]: I0309 10:00:10.609097 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" podStartSLOduration=2.609081508 podStartE2EDuration="2.609081508s" podCreationTimestamp="2026-03-09 10:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:10.607094053 +0000 UTC m=+2414.167021883" watchObservedRunningTime="2026-03-09 10:00:10.609081508 +0000 UTC m=+2414.169009318" Mar 09 10:00:11 crc kubenswrapper[4971]: I0309 10:00:11.597796 4971 generic.go:334] "Generic (PLEG): container finished" podID="bd27aca6-88cd-42b3-93f1-8a35dd011ab7" 
containerID="53e80d3834b2cfa27323b3ec5f8484f2c6919479f40f841028401dfd86844cd7" exitCode=0 Mar 09 10:00:11 crc kubenswrapper[4971]: I0309 10:00:11.597863 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" event={"ID":"bd27aca6-88cd-42b3-93f1-8a35dd011ab7","Type":"ContainerDied","Data":"53e80d3834b2cfa27323b3ec5f8484f2c6919479f40f841028401dfd86844cd7"} Mar 09 10:00:12 crc kubenswrapper[4971]: I0309 10:00:12.861779 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:12 crc kubenswrapper[4971]: I0309 10:00:12.890687 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc"] Mar 09 10:00:12 crc kubenswrapper[4971]: I0309 10:00:12.895742 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc"] Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024141 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024308 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024343 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 
10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024402 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsgsl\" (UniqueName: \"kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024426 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.024450 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf\") pod \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\" (UID: \"bd27aca6-88cd-42b3-93f1-8a35dd011ab7\") " Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.025601 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.026321 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.030200 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl" (OuterVolumeSpecName: "kube-api-access-gsgsl") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "kube-api-access-gsgsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.048865 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.049089 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts" (OuterVolumeSpecName: "scripts") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.049610 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bd27aca6-88cd-42b3-93f1-8a35dd011ab7" (UID: "bd27aca6-88cd-42b3-93f1-8a35dd011ab7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126635 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsgsl\" (UniqueName: \"kubernetes.io/projected/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-kube-api-access-gsgsl\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126678 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126692 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126704 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126716 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.126727 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd27aca6-88cd-42b3-93f1-8a35dd011ab7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.160210 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd27aca6-88cd-42b3-93f1-8a35dd011ab7" path="/var/lib/kubelet/pods/bd27aca6-88cd-42b3-93f1-8a35dd011ab7/volumes" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.615695 4971 scope.go:117] "RemoveContainer" 
containerID="53e80d3834b2cfa27323b3ec5f8484f2c6919479f40f841028401dfd86844cd7" Mar 09 10:00:13 crc kubenswrapper[4971]: I0309 10:00:13.615760 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rpgkc" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.024291 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h"] Mar 09 10:00:14 crc kubenswrapper[4971]: E0309 10:00:14.024611 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd27aca6-88cd-42b3-93f1-8a35dd011ab7" containerName="swift-ring-rebalance" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.024623 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd27aca6-88cd-42b3-93f1-8a35dd011ab7" containerName="swift-ring-rebalance" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.024787 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd27aca6-88cd-42b3-93f1-8a35dd011ab7" containerName="swift-ring-rebalance" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.025298 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.027225 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.028054 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.038811 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h"] Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141130 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141184 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141248 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141276 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141325 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxhb\" (UniqueName: \"kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.141361 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242099 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242145 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242190 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjxhb\" (UniqueName: \"kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242214 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242301 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.242326 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.243151 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc 
kubenswrapper[4971]: I0309 10:00:14.243535 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.244475 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.246965 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.249590 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.264242 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjxhb\" (UniqueName: \"kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb\") pod \"swift-ring-rebalance-debug-8bv2h\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 
10:00:14.427259 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:14 crc kubenswrapper[4971]: I0309 10:00:14.853405 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h"] Mar 09 10:00:15 crc kubenswrapper[4971]: I0309 10:00:15.635146 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" event={"ID":"4c2b3448-b79b-4b19-a440-9bf05788e77b","Type":"ContainerStarted","Data":"cd636d6025b257e9f53085f75309c5876201edfae28121ba7a1624559675e6a4"} Mar 09 10:00:15 crc kubenswrapper[4971]: I0309 10:00:15.635708 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" event={"ID":"4c2b3448-b79b-4b19-a440-9bf05788e77b","Type":"ContainerStarted","Data":"d2912cd6275bcde7513da1e7b53b6f1dede50909bfab609fe6ffae434be73e9d"} Mar 09 10:00:15 crc kubenswrapper[4971]: I0309 10:00:15.651960 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" podStartSLOduration=1.651941242 podStartE2EDuration="1.651941242s" podCreationTimestamp="2026-03-09 10:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:15.649436713 +0000 UTC m=+2419.209364533" watchObservedRunningTime="2026-03-09 10:00:15.651941242 +0000 UTC m=+2419.211869052" Mar 09 10:00:16 crc kubenswrapper[4971]: I0309 10:00:16.676668 4971 generic.go:334] "Generic (PLEG): container finished" podID="4c2b3448-b79b-4b19-a440-9bf05788e77b" containerID="cd636d6025b257e9f53085f75309c5876201edfae28121ba7a1624559675e6a4" exitCode=0 Mar 09 10:00:16 crc kubenswrapper[4971]: I0309 10:00:16.676713 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" 
event={"ID":"4c2b3448-b79b-4b19-a440-9bf05788e77b","Type":"ContainerDied","Data":"cd636d6025b257e9f53085f75309c5876201edfae28121ba7a1624559675e6a4"} Mar 09 10:00:17 crc kubenswrapper[4971]: I0309 10:00:17.969167 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.007790 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h"] Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.015523 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h"] Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096565 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096622 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjxhb\" (UniqueName: \"kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096646 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096674 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096721 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.096828 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts\") pod \"4c2b3448-b79b-4b19-a440-9bf05788e77b\" (UID: \"4c2b3448-b79b-4b19-a440-9bf05788e77b\") " Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.097856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.098095 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.107093 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb" (OuterVolumeSpecName: "kube-api-access-mjxhb") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "kube-api-access-mjxhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.121959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts" (OuterVolumeSpecName: "scripts") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.122560 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.137297 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4c2b3448-b79b-4b19-a440-9bf05788e77b" (UID: "4c2b3448-b79b-4b19-a440-9bf05788e77b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.203803 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.204050 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.204126 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjxhb\" (UniqueName: \"kubernetes.io/projected/4c2b3448-b79b-4b19-a440-9bf05788e77b-kube-api-access-mjxhb\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.204198 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4c2b3448-b79b-4b19-a440-9bf05788e77b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.204267 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4c2b3448-b79b-4b19-a440-9bf05788e77b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.204408 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4c2b3448-b79b-4b19-a440-9bf05788e77b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.696651 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2912cd6275bcde7513da1e7b53b6f1dede50909bfab609fe6ffae434be73e9d" Mar 09 10:00:18 crc kubenswrapper[4971]: I0309 10:00:18.696730 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8bv2h" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.141239 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp"] Mar 09 10:00:19 crc kubenswrapper[4971]: E0309 10:00:19.141593 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2b3448-b79b-4b19-a440-9bf05788e77b" containerName="swift-ring-rebalance" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.141609 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2b3448-b79b-4b19-a440-9bf05788e77b" containerName="swift-ring-rebalance" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.141773 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2b3448-b79b-4b19-a440-9bf05788e77b" containerName="swift-ring-rebalance" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.142369 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.144399 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.145842 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.163790 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2b3448-b79b-4b19-a440-9bf05788e77b" path="/var/lib/kubelet/pods/4c2b3448-b79b-4b19-a440-9bf05788e77b/volumes" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.164381 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp"] Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.219538 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.219624 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.219654 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.219682 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.219702 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f24z\" (UniqueName: \"kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: 
I0309 10:00:19.219779 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.321815 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.321881 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.321917 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.321939 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc 
kubenswrapper[4971]: I0309 10:00:19.322280 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.322307 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f24z\" (UniqueName: \"kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.323321 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.323776 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.323819 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc 
kubenswrapper[4971]: I0309 10:00:19.329196 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.329259 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.341477 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f24z\" (UniqueName: \"kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z\") pod \"swift-ring-rebalance-debug-l6dzp\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.462002 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:19 crc kubenswrapper[4971]: I0309 10:00:19.852266 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp"] Mar 09 10:00:20 crc kubenswrapper[4971]: I0309 10:00:20.713708 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" event={"ID":"31f9cbd5-d5da-47c4-ba9a-581d538b6d82","Type":"ContainerStarted","Data":"963769de7c319cb0b89205ddfd9e1438980703d770db444c28b70839237fed57"} Mar 09 10:00:20 crc kubenswrapper[4971]: I0309 10:00:20.714088 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" event={"ID":"31f9cbd5-d5da-47c4-ba9a-581d538b6d82","Type":"ContainerStarted","Data":"7719758ae38c58b69d70986229396e6edf4c40334da56d2ee2664671c4644a8b"} Mar 09 10:00:20 crc kubenswrapper[4971]: I0309 10:00:20.729870 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" podStartSLOduration=1.729851419 podStartE2EDuration="1.729851419s" podCreationTimestamp="2026-03-09 10:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:20.727437472 +0000 UTC m=+2424.287365302" watchObservedRunningTime="2026-03-09 10:00:20.729851419 +0000 UTC m=+2424.289779229" Mar 09 10:00:21 crc kubenswrapper[4971]: I0309 10:00:21.725221 4971 generic.go:334] "Generic (PLEG): container finished" podID="31f9cbd5-d5da-47c4-ba9a-581d538b6d82" containerID="963769de7c319cb0b89205ddfd9e1438980703d770db444c28b70839237fed57" exitCode=0 Mar 09 10:00:21 crc kubenswrapper[4971]: I0309 10:00:21.725263 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" 
event={"ID":"31f9cbd5-d5da-47c4-ba9a-581d538b6d82","Type":"ContainerDied","Data":"963769de7c319cb0b89205ddfd9e1438980703d770db444c28b70839237fed57"} Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.013151 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.053160 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp"] Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.070149 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp"] Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.076966 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.077079 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.077176 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.077262 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.077302 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f24z\" (UniqueName: \"kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.077385 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts\") pod \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\" (UID: \"31f9cbd5-d5da-47c4-ba9a-581d538b6d82\") " Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.078450 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.078486 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.083132 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z" (OuterVolumeSpecName: "kube-api-access-4f24z") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "kube-api-access-4f24z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.098421 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts" (OuterVolumeSpecName: "scripts") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.099060 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.099145 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "31f9cbd5-d5da-47c4-ba9a-581d538b6d82" (UID: "31f9cbd5-d5da-47c4-ba9a-581d538b6d82"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.165653 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f9cbd5-d5da-47c4-ba9a-581d538b6d82" path="/var/lib/kubelet/pods/31f9cbd5-d5da-47c4-ba9a-581d538b6d82/volumes" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178898 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178930 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178940 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f24z\" (UniqueName: \"kubernetes.io/projected/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-kube-api-access-4f24z\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178949 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178957 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.178965 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/31f9cbd5-d5da-47c4-ba9a-581d538b6d82-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.749919 4971 scope.go:117] "RemoveContainer" 
containerID="963769de7c319cb0b89205ddfd9e1438980703d770db444c28b70839237fed57" Mar 09 10:00:23 crc kubenswrapper[4971]: I0309 10:00:23.749981 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l6dzp" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.183026 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2z472"] Mar 09 10:00:24 crc kubenswrapper[4971]: E0309 10:00:24.183762 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f9cbd5-d5da-47c4-ba9a-581d538b6d82" containerName="swift-ring-rebalance" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.183777 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f9cbd5-d5da-47c4-ba9a-581d538b6d82" containerName="swift-ring-rebalance" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.183957 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f9cbd5-d5da-47c4-ba9a-581d538b6d82" containerName="swift-ring-rebalance" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.184445 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.187015 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.187447 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.190540 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2z472"] Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293611 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293692 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rlk\" (UniqueName: \"kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293753 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293773 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293796 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.293824 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.394981 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.395075 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rlk\" (UniqueName: \"kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc 
kubenswrapper[4971]: I0309 10:00:24.395136 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.395164 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.395195 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.395228 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.395788 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: 
I0309 10:00:24.396108 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.396122 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.400100 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.407762 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.411194 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rlk\" (UniqueName: \"kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk\") pod \"swift-ring-rebalance-debug-2z472\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.504226 
4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472"
Mar 09 10:00:24 crc kubenswrapper[4971]: I0309 10:00:24.934488 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2z472"]
Mar 09 10:00:24 crc kubenswrapper[4971]: W0309 10:00:24.951338 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d94ee6f_e763_45a3_98e4_ae3fb529389e.slice/crio-2e44bdab43e13a73210ef36b3b73072abec397b33da7223ab4cdc10fd0601efc WatchSource:0}: Error finding container 2e44bdab43e13a73210ef36b3b73072abec397b33da7223ab4cdc10fd0601efc: Status 404 returned error can't find the container with id 2e44bdab43e13a73210ef36b3b73072abec397b33da7223ab4cdc10fd0601efc
Mar 09 10:00:25 crc kubenswrapper[4971]: I0309 10:00:25.766271 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" event={"ID":"2d94ee6f-e763-45a3-98e4-ae3fb529389e","Type":"ContainerStarted","Data":"482a4801c0c1b311da565025675e1a6aaa1ff8dbfd62aa7bd77d2c675b688b64"}
Mar 09 10:00:25 crc kubenswrapper[4971]: I0309 10:00:25.766653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" event={"ID":"2d94ee6f-e763-45a3-98e4-ae3fb529389e","Type":"ContainerStarted","Data":"2e44bdab43e13a73210ef36b3b73072abec397b33da7223ab4cdc10fd0601efc"}
Mar 09 10:00:25 crc kubenswrapper[4971]: I0309 10:00:25.786225 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" podStartSLOduration=1.786206983 podStartE2EDuration="1.786206983s" podCreationTimestamp="2026-03-09 10:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:25.78282586 +0000 UTC m=+2429.342753670" watchObservedRunningTime="2026-03-09 10:00:25.786206983 +0000 UTC m=+2429.346134793"
Mar 09 10:00:26 crc kubenswrapper[4971]: I0309 10:00:26.786894 4971 generic.go:334] "Generic (PLEG): container finished" podID="2d94ee6f-e763-45a3-98e4-ae3fb529389e" containerID="482a4801c0c1b311da565025675e1a6aaa1ff8dbfd62aa7bd77d2c675b688b64" exitCode=0
Mar 09 10:00:26 crc kubenswrapper[4971]: I0309 10:00:26.786941 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472" event={"ID":"2d94ee6f-e763-45a3-98e4-ae3fb529389e","Type":"ContainerDied","Data":"482a4801c0c1b311da565025675e1a6aaa1ff8dbfd62aa7bd77d2c675b688b64"}
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.124491 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472"
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.152158 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2z472"]
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.157509 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2z472"]
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.250964 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.251012 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4rlk\" (UniqueName: \"kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.251097 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.251120 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.251237 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.251387 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices\") pod \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\" (UID: \"2d94ee6f-e763-45a3-98e4-ae3fb529389e\") "
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.252411 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.252571 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.255955 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk" (OuterVolumeSpecName: "kube-api-access-s4rlk") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "kube-api-access-s4rlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.270281 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts" (OuterVolumeSpecName: "scripts") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.272895 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.274216 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2d94ee6f-e763-45a3-98e4-ae3fb529389e" (UID: "2d94ee6f-e763-45a3-98e4-ae3fb529389e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352743 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352790 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4rlk\" (UniqueName: \"kubernetes.io/projected/2d94ee6f-e763-45a3-98e4-ae3fb529389e-kube-api-access-s4rlk\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352803 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352812 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d94ee6f-e763-45a3-98e4-ae3fb529389e-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352821 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d94ee6f-e763-45a3-98e4-ae3fb529389e-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.352831 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d94ee6f-e763-45a3-98e4-ae3fb529389e-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.814540 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e44bdab43e13a73210ef36b3b73072abec397b33da7223ab4cdc10fd0601efc"
Mar 09 10:00:28 crc kubenswrapper[4971]: I0309 10:00:28.814591 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2z472"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.161695 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d94ee6f-e763-45a3-98e4-ae3fb529389e" path="/var/lib/kubelet/pods/2d94ee6f-e763-45a3-98e4-ae3fb529389e/volumes"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.292034 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"]
Mar 09 10:00:29 crc kubenswrapper[4971]: E0309 10:00:29.292555 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d94ee6f-e763-45a3-98e4-ae3fb529389e" containerName="swift-ring-rebalance"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.292577 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d94ee6f-e763-45a3-98e4-ae3fb529389e" containerName="swift-ring-rebalance"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.292735 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d94ee6f-e763-45a3-98e4-ae3fb529389e" containerName="swift-ring-rebalance"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.293285 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.294900 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.295830 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.302639 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"]
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.367910 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.367967 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.368005 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.368063 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.368103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.368142 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.469295 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.469660 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.469801 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.469895 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.470051 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.470232 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.470733 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.470934 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.471172 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.477126 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.477176 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.492549 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82\") pod \"swift-ring-rebalance-debug-vk87m\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:29 crc kubenswrapper[4971]: I0309 10:00:29.610076 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:30 crc kubenswrapper[4971]: W0309 10:00:30.063903 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod005c8ee8_a75f_4130_b461_557341a6f693.slice/crio-7623c97deba123a80ab23edbf5a3a478ba2adea7dca3a9cf0a2fbb6d6fb27c77 WatchSource:0}: Error finding container 7623c97deba123a80ab23edbf5a3a478ba2adea7dca3a9cf0a2fbb6d6fb27c77: Status 404 returned error can't find the container with id 7623c97deba123a80ab23edbf5a3a478ba2adea7dca3a9cf0a2fbb6d6fb27c77
Mar 09 10:00:30 crc kubenswrapper[4971]: I0309 10:00:30.064167 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"]
Mar 09 10:00:30 crc kubenswrapper[4971]: I0309 10:00:30.835081 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m" event={"ID":"005c8ee8-a75f-4130-b461-557341a6f693","Type":"ContainerStarted","Data":"d93110eeddc1858a96ea5f8cbc6ca309fbf6b963d629c9ffde23ec8cdece4f04"}
Mar 09 10:00:30 crc kubenswrapper[4971]: I0309 10:00:30.835161 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m" event={"ID":"005c8ee8-a75f-4130-b461-557341a6f693","Type":"ContainerStarted","Data":"7623c97deba123a80ab23edbf5a3a478ba2adea7dca3a9cf0a2fbb6d6fb27c77"}
Mar 09 10:00:30 crc kubenswrapper[4971]: I0309 10:00:30.852754 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m" podStartSLOduration=1.852732226 podStartE2EDuration="1.852732226s" podCreationTimestamp="2026-03-09 10:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:30.847914034 +0000 UTC m=+2434.407841844" watchObservedRunningTime="2026-03-09 10:00:30.852732226 +0000 UTC m=+2434.412660036"
Mar 09 10:00:31 crc kubenswrapper[4971]: I0309 10:00:31.843871 4971 generic.go:334] "Generic (PLEG): container finished" podID="005c8ee8-a75f-4130-b461-557341a6f693" containerID="d93110eeddc1858a96ea5f8cbc6ca309fbf6b963d629c9ffde23ec8cdece4f04" exitCode=0
Mar 09 10:00:31 crc kubenswrapper[4971]: I0309 10:00:31.843916 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m" event={"ID":"005c8ee8-a75f-4130-b461-557341a6f693","Type":"ContainerDied","Data":"d93110eeddc1858a96ea5f8cbc6ca309fbf6b963d629c9ffde23ec8cdece4f04"}
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.131402 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.173801 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"]
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.174788 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"]
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226095 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226201 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226301 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226398 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226422 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.226459 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf\") pod \"005c8ee8-a75f-4130-b461-557341a6f693\" (UID: \"005c8ee8-a75f-4130-b461-557341a6f693\") "
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.227657 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.227974 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.231444 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82" (OuterVolumeSpecName: "kube-api-access-2xv82") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "kube-api-access-2xv82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.247841 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts" (OuterVolumeSpecName: "scripts") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.248207 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.254232 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "005c8ee8-a75f-4130-b461-557341a6f693" (UID: "005c8ee8-a75f-4130-b461-557341a6f693"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328459 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328662 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/005c8ee8-a75f-4130-b461-557341a6f693-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328722 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328777 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005c8ee8-a75f-4130-b461-557341a6f693-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328861 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xv82\" (UniqueName: \"kubernetes.io/projected/005c8ee8-a75f-4130-b461-557341a6f693-kube-api-access-2xv82\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.328920 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/005c8ee8-a75f-4130-b461-557341a6f693-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.862611 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7623c97deba123a80ab23edbf5a3a478ba2adea7dca3a9cf0a2fbb6d6fb27c77"
Mar 09 10:00:33 crc kubenswrapper[4971]: I0309 10:00:33.862898 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vk87m"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.161000 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005c8ee8-a75f-4130-b461-557341a6f693" path="/var/lib/kubelet/pods/005c8ee8-a75f-4130-b461-557341a6f693/volumes"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.373883 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"]
Mar 09 10:00:35 crc kubenswrapper[4971]: E0309 10:00:35.374230 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005c8ee8-a75f-4130-b461-557341a6f693" containerName="swift-ring-rebalance"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.374249 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="005c8ee8-a75f-4130-b461-557341a6f693" containerName="swift-ring-rebalance"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.374396 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="005c8ee8-a75f-4130-b461-557341a6f693" containerName="swift-ring-rebalance"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.374890 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.376537 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.376893 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.396561 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"]
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462689 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462735 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v79g6\" (UniqueName: \"kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462774 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462843 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.462893 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.563878 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.563981 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564034 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564083 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564105 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v79g6\" (UniqueName: \"kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564130 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564423 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564932 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.564983 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.569691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.573858 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]: I0309 10:00:35.582693 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v79g6\" (UniqueName: \"kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6\") pod \"swift-ring-rebalance-debug-5fm8t\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"
Mar 09 10:00:35 crc kubenswrapper[4971]:
I0309 10:00:35.696455 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" Mar 09 10:00:36 crc kubenswrapper[4971]: I0309 10:00:36.164430 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"] Mar 09 10:00:36 crc kubenswrapper[4971]: I0309 10:00:36.900537 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" event={"ID":"af55bf4a-75e9-41e4-8d55-6fca30c24135","Type":"ContainerStarted","Data":"293423da4e0a9a0ae438579ef410d5337f4b42f07585d8bd88b623446084b8f1"} Mar 09 10:00:36 crc kubenswrapper[4971]: I0309 10:00:36.900913 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" event={"ID":"af55bf4a-75e9-41e4-8d55-6fca30c24135","Type":"ContainerStarted","Data":"3bc21d79ba4bdc6c21d3690af0b2bf3a15aad85d984cdd2a111150c1de39ea42"} Mar 09 10:00:36 crc kubenswrapper[4971]: I0309 10:00:36.924636 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" podStartSLOduration=1.924610326 podStartE2EDuration="1.924610326s" podCreationTimestamp="2026-03-09 10:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:36.91712402 +0000 UTC m=+2440.477051830" watchObservedRunningTime="2026-03-09 10:00:36.924610326 +0000 UTC m=+2440.484538136" Mar 09 10:00:37 crc kubenswrapper[4971]: I0309 10:00:37.909343 4971 generic.go:334] "Generic (PLEG): container finished" podID="af55bf4a-75e9-41e4-8d55-6fca30c24135" containerID="293423da4e0a9a0ae438579ef410d5337f4b42f07585d8bd88b623446084b8f1" exitCode=0 Mar 09 10:00:37 crc kubenswrapper[4971]: I0309 10:00:37.909485 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" 
event={"ID":"af55bf4a-75e9-41e4-8d55-6fca30c24135","Type":"ContainerDied","Data":"293423da4e0a9a0ae438579ef410d5337f4b42f07585d8bd88b623446084b8f1"} Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.217581 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.249066 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"] Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.255294 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t"] Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315019 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315086 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315121 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v79g6\" (UniqueName: \"kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315167 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315275 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315382 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift\") pod \"af55bf4a-75e9-41e4-8d55-6fca30c24135\" (UID: \"af55bf4a-75e9-41e4-8d55-6fca30c24135\") " Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.315906 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.316846 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.326634 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6" (OuterVolumeSpecName: "kube-api-access-v79g6") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "kube-api-access-v79g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.334626 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts" (OuterVolumeSpecName: "scripts") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.339219 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.339620 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "af55bf4a-75e9-41e4-8d55-6fca30c24135" (UID: "af55bf4a-75e9-41e4-8d55-6fca30c24135"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417555 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af55bf4a-75e9-41e4-8d55-6fca30c24135-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417602 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417614 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417623 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v79g6\" (UniqueName: \"kubernetes.io/projected/af55bf4a-75e9-41e4-8d55-6fca30c24135-kube-api-access-v79g6\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417635 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af55bf4a-75e9-41e4-8d55-6fca30c24135-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.417642 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af55bf4a-75e9-41e4-8d55-6fca30c24135-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.926217 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc21d79ba4bdc6c21d3690af0b2bf3a15aad85d984cdd2a111150c1de39ea42" Mar 09 10:00:39 crc kubenswrapper[4971]: I0309 10:00:39.926265 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5fm8t" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.388391 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj"] Mar 09 10:00:40 crc kubenswrapper[4971]: E0309 10:00:40.388707 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af55bf4a-75e9-41e4-8d55-6fca30c24135" containerName="swift-ring-rebalance" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.388719 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="af55bf4a-75e9-41e4-8d55-6fca30c24135" containerName="swift-ring-rebalance" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.388876 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="af55bf4a-75e9-41e4-8d55-6fca30c24135" containerName="swift-ring-rebalance" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.389327 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.391879 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.392368 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.396485 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj"] Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.429985 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.430104 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.430225 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zww86\" (UniqueName: \"kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.430285 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.430338 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.430381 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532029 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532082 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532132 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532165 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532224 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zww86\" (UniqueName: 
\"kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.532249 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.533084 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.533267 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.533267 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.536181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.537067 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.548103 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zww86\" (UniqueName: \"kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86\") pod \"swift-ring-rebalance-debug-7bvgj\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:40 crc kubenswrapper[4971]: I0309 10:00:40.705287 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:41 crc kubenswrapper[4971]: I0309 10:00:41.109883 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj"] Mar 09 10:00:41 crc kubenswrapper[4971]: I0309 10:00:41.162876 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af55bf4a-75e9-41e4-8d55-6fca30c24135" path="/var/lib/kubelet/pods/af55bf4a-75e9-41e4-8d55-6fca30c24135/volumes" Mar 09 10:00:41 crc kubenswrapper[4971]: I0309 10:00:41.944196 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" event={"ID":"b7a01867-fc5b-4cd2-8a17-7b281058d870","Type":"ContainerStarted","Data":"0b5f58db7841898efe0f54a28ba295479a11453ff2ee34fb515fcc92b8fe7f5e"} Mar 09 10:00:41 crc kubenswrapper[4971]: I0309 10:00:41.944468 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" event={"ID":"b7a01867-fc5b-4cd2-8a17-7b281058d870","Type":"ContainerStarted","Data":"086b50705256754845f0871fd137f9586748b20af867249a701d8026bb8f0db3"} Mar 09 10:00:41 crc kubenswrapper[4971]: I0309 10:00:41.964446 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" podStartSLOduration=1.964425265 podStartE2EDuration="1.964425265s" podCreationTimestamp="2026-03-09 10:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:41.96134292 +0000 UTC m=+2445.521270730" watchObservedRunningTime="2026-03-09 10:00:41.964425265 +0000 UTC m=+2445.524353075" Mar 09 10:00:42 crc kubenswrapper[4971]: I0309 10:00:42.954972 4971 generic.go:334] "Generic (PLEG): container finished" podID="b7a01867-fc5b-4cd2-8a17-7b281058d870" containerID="0b5f58db7841898efe0f54a28ba295479a11453ff2ee34fb515fcc92b8fe7f5e" exitCode=0 
Mar 09 10:00:42 crc kubenswrapper[4971]: I0309 10:00:42.955034 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" event={"ID":"b7a01867-fc5b-4cd2-8a17-7b281058d870","Type":"ContainerDied","Data":"0b5f58db7841898efe0f54a28ba295479a11453ff2ee34fb515fcc92b8fe7f5e"} Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.222536 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.250146 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj"] Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.256075 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj"] Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.290143 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zww86\" (UniqueName: \"kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.290462 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.290575 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 
10:00:44.290654 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.290805 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.290893 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts\") pod \"b7a01867-fc5b-4cd2-8a17-7b281058d870\" (UID: \"b7a01867-fc5b-4cd2-8a17-7b281058d870\") " Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.291148 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.291366 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.291373 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.295112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86" (OuterVolumeSpecName: "kube-api-access-zww86") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). InnerVolumeSpecName "kube-api-access-zww86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.308674 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts" (OuterVolumeSpecName: "scripts") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.312016 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.314856 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b7a01867-fc5b-4cd2-8a17-7b281058d870" (UID: "b7a01867-fc5b-4cd2-8a17-7b281058d870"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.393262 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7a01867-fc5b-4cd2-8a17-7b281058d870-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.393303 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zww86\" (UniqueName: \"kubernetes.io/projected/b7a01867-fc5b-4cd2-8a17-7b281058d870-kube-api-access-zww86\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.393318 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.393328 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b7a01867-fc5b-4cd2-8a17-7b281058d870-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.393338 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b7a01867-fc5b-4cd2-8a17-7b281058d870-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.794309 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:00:44 crc kubenswrapper[4971]: I0309 10:00:44.794457 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.001963 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="086b50705256754845f0871fd137f9586748b20af867249a701d8026bb8f0db3" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.002074 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7bvgj" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.161122 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a01867-fc5b-4cd2-8a17-7b281058d870" path="/var/lib/kubelet/pods/b7a01867-fc5b-4cd2-8a17-7b281058d870/volumes" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.374339 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-27wp5"] Mar 09 10:00:45 crc kubenswrapper[4971]: E0309 10:00:45.374651 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a01867-fc5b-4cd2-8a17-7b281058d870" containerName="swift-ring-rebalance" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.374663 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a01867-fc5b-4cd2-8a17-7b281058d870" containerName="swift-ring-rebalance" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.374811 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a01867-fc5b-4cd2-8a17-7b281058d870" containerName="swift-ring-rebalance" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.375255 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.383865 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-27wp5"] Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.384783 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.384850 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508126 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508171 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508196 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508279 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508329 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrbr\" (UniqueName: \"kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.508382 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.609996 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610042 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610097 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610123 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610153 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrbr\" (UniqueName: \"kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610174 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.610864 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: 
I0309 10:00:45.611015 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.611061 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.621467 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.630744 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.635631 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrbr\" (UniqueName: \"kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr\") pod \"swift-ring-rebalance-debug-27wp5\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:45 crc kubenswrapper[4971]: I0309 10:00:45.697328 4971 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:46 crc kubenswrapper[4971]: I0309 10:00:46.119788 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-27wp5"] Mar 09 10:00:46 crc kubenswrapper[4971]: W0309 10:00:46.131525 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680c3ab2_991a_4f86_9c6e_eb9fba0075f1.slice/crio-6a504a09ea9448fed0180c13ca7180559ae683948f4428d754f6a7697128b1ae WatchSource:0}: Error finding container 6a504a09ea9448fed0180c13ca7180559ae683948f4428d754f6a7697128b1ae: Status 404 returned error can't find the container with id 6a504a09ea9448fed0180c13ca7180559ae683948f4428d754f6a7697128b1ae Mar 09 10:00:47 crc kubenswrapper[4971]: I0309 10:00:47.042286 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" event={"ID":"680c3ab2-991a-4f86-9c6e-eb9fba0075f1","Type":"ContainerStarted","Data":"e0c8f2818a23065647eea7145c2d2ebeac0f8d490c56efca08f9a76471e68c7d"} Mar 09 10:00:47 crc kubenswrapper[4971]: I0309 10:00:47.042693 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" event={"ID":"680c3ab2-991a-4f86-9c6e-eb9fba0075f1","Type":"ContainerStarted","Data":"6a504a09ea9448fed0180c13ca7180559ae683948f4428d754f6a7697128b1ae"} Mar 09 10:00:47 crc kubenswrapper[4971]: I0309 10:00:47.063203 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" podStartSLOduration=2.063182733 podStartE2EDuration="2.063182733s" podCreationTimestamp="2026-03-09 10:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:47.061322292 +0000 UTC m=+2450.621250122" 
watchObservedRunningTime="2026-03-09 10:00:47.063182733 +0000 UTC m=+2450.623110553" Mar 09 10:00:48 crc kubenswrapper[4971]: I0309 10:00:48.052586 4971 generic.go:334] "Generic (PLEG): container finished" podID="680c3ab2-991a-4f86-9c6e-eb9fba0075f1" containerID="e0c8f2818a23065647eea7145c2d2ebeac0f8d490c56efca08f9a76471e68c7d" exitCode=0 Mar 09 10:00:48 crc kubenswrapper[4971]: I0309 10:00:48.052649 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" event={"ID":"680c3ab2-991a-4f86-9c6e-eb9fba0075f1","Type":"ContainerDied","Data":"e0c8f2818a23065647eea7145c2d2ebeac0f8d490c56efca08f9a76471e68c7d"} Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.350522 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.389502 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-27wp5"] Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.396403 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-27wp5"] Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.470825 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.470886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.470964 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471049 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqrbr\" (UniqueName: \"kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471147 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471181 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf\") pod \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\" (UID: \"680c3ab2-991a-4f86-9c6e-eb9fba0075f1\") " Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471435 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471539 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.471668 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.476367 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr" (OuterVolumeSpecName: "kube-api-access-jqrbr") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). InnerVolumeSpecName "kube-api-access-jqrbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.490156 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts" (OuterVolumeSpecName: "scripts") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.492168 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.493169 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "680c3ab2-991a-4f86-9c6e-eb9fba0075f1" (UID: "680c3ab2-991a-4f86-9c6e-eb9fba0075f1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.572731 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqrbr\" (UniqueName: \"kubernetes.io/projected/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-kube-api-access-jqrbr\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.572763 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.572773 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.572783 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:49 crc kubenswrapper[4971]: I0309 10:00:49.572791 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/680c3ab2-991a-4f86-9c6e-eb9fba0075f1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.069959 4971 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6a504a09ea9448fed0180c13ca7180559ae683948f4428d754f6a7697128b1ae" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.070008 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-27wp5" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.516273 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd"] Mar 09 10:00:50 crc kubenswrapper[4971]: E0309 10:00:50.517169 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680c3ab2-991a-4f86-9c6e-eb9fba0075f1" containerName="swift-ring-rebalance" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.517275 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="680c3ab2-991a-4f86-9c6e-eb9fba0075f1" containerName="swift-ring-rebalance" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.517557 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="680c3ab2-991a-4f86-9c6e-eb9fba0075f1" containerName="swift-ring-rebalance" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.518072 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.521183 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.521401 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.532550 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd"] Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584094 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584192 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584243 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584319 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584398 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.584432 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc484\" (UniqueName: \"kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686210 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686322 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc 
kubenswrapper[4971]: I0309 10:00:50.686424 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686494 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc484\" (UniqueName: \"kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686554 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.686789 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 
10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.687164 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.687699 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.694334 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.694366 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc kubenswrapper[4971]: I0309 10:00:50.707507 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc484\" (UniqueName: \"kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484\") pod \"swift-ring-rebalance-debug-jqlpd\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:50 crc 
kubenswrapper[4971]: I0309 10:00:50.846862 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:51 crc kubenswrapper[4971]: I0309 10:00:51.171009 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680c3ab2-991a-4f86-9c6e-eb9fba0075f1" path="/var/lib/kubelet/pods/680c3ab2-991a-4f86-9c6e-eb9fba0075f1/volumes" Mar 09 10:00:51 crc kubenswrapper[4971]: I0309 10:00:51.268004 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd"] Mar 09 10:00:52 crc kubenswrapper[4971]: I0309 10:00:52.090505 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" event={"ID":"19b8a8ef-d394-4cbd-8245-bb28c47d518e","Type":"ContainerStarted","Data":"f50802e17a85d376c9d2e653d714e35101a371e2bba637b26a23a2b18e76a606"} Mar 09 10:00:52 crc kubenswrapper[4971]: I0309 10:00:52.090904 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" event={"ID":"19b8a8ef-d394-4cbd-8245-bb28c47d518e","Type":"ContainerStarted","Data":"2369278dc45a73747542430302399db89fd2ffdb6e49b4c8e9ea11bf52cfefe9"} Mar 09 10:00:52 crc kubenswrapper[4971]: I0309 10:00:52.112386 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" podStartSLOduration=2.112329999 podStartE2EDuration="2.112329999s" podCreationTimestamp="2026-03-09 10:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:52.103462246 +0000 UTC m=+2455.663390056" watchObservedRunningTime="2026-03-09 10:00:52.112329999 +0000 UTC m=+2455.672257849" Mar 09 10:00:53 crc kubenswrapper[4971]: I0309 10:00:53.102236 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="19b8a8ef-d394-4cbd-8245-bb28c47d518e" containerID="f50802e17a85d376c9d2e653d714e35101a371e2bba637b26a23a2b18e76a606" exitCode=0 Mar 09 10:00:53 crc kubenswrapper[4971]: I0309 10:00:53.102579 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" event={"ID":"19b8a8ef-d394-4cbd-8245-bb28c47d518e","Type":"ContainerDied","Data":"f50802e17a85d376c9d2e653d714e35101a371e2bba637b26a23a2b18e76a606"} Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.395710 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.432798 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd"] Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.435999 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436098 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc484\" (UniqueName: \"kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436145 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436186 4971 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436236 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf\") pod \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\" (UID: \"19b8a8ef-d394-4cbd-8245-bb28c47d518e\") " Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.436485 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.437991 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.439878 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd"] Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.442433 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484" (OuterVolumeSpecName: "kube-api-access-fc484") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "kube-api-access-fc484". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.456943 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts" (OuterVolumeSpecName: "scripts") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.463609 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.465577 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "19b8a8ef-d394-4cbd-8245-bb28c47d518e" (UID: "19b8a8ef-d394-4cbd-8245-bb28c47d518e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537894 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537921 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc484\" (UniqueName: \"kubernetes.io/projected/19b8a8ef-d394-4cbd-8245-bb28c47d518e-kube-api-access-fc484\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537932 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19b8a8ef-d394-4cbd-8245-bb28c47d518e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537941 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/19b8a8ef-d394-4cbd-8245-bb28c47d518e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537949 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:54 crc kubenswrapper[4971]: I0309 10:00:54.537958 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/19b8a8ef-d394-4cbd-8245-bb28c47d518e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.124822 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2369278dc45a73747542430302399db89fd2ffdb6e49b4c8e9ea11bf52cfefe9" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.124937 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jqlpd" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.169688 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b8a8ef-d394-4cbd-8245-bb28c47d518e" path="/var/lib/kubelet/pods/19b8a8ef-d394-4cbd-8245-bb28c47d518e/volumes" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.557982 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cms7b"] Mar 09 10:00:55 crc kubenswrapper[4971]: E0309 10:00:55.558656 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b8a8ef-d394-4cbd-8245-bb28c47d518e" containerName="swift-ring-rebalance" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.558670 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b8a8ef-d394-4cbd-8245-bb28c47d518e" containerName="swift-ring-rebalance" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.558870 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b8a8ef-d394-4cbd-8245-bb28c47d518e" containerName="swift-ring-rebalance" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.559430 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.563390 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.563525 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.565999 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cms7b"] Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654235 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654320 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654384 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654408 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654452 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvm5v\" (UniqueName: \"kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.654556 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.756536 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.756579 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc 
kubenswrapper[4971]: I0309 10:00:55.756619 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvm5v\" (UniqueName: \"kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.756665 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.756719 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.756750 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.757181 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc 
kubenswrapper[4971]: I0309 10:00:55.757502 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.758284 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.765125 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.765208 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: I0309 10:00:55.775279 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvm5v\" (UniqueName: \"kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v\") pod \"swift-ring-rebalance-debug-cms7b\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:55 crc kubenswrapper[4971]: 
I0309 10:00:55.879331 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:56 crc kubenswrapper[4971]: I0309 10:00:56.356436 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cms7b"] Mar 09 10:00:56 crc kubenswrapper[4971]: W0309 10:00:56.361224 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod719f99c7_f205_4ced_9e1a_911032286254.slice/crio-d37f9741349d6e4deef081025b234c2f1eba4d57b0d399c98a23215b33601770 WatchSource:0}: Error finding container d37f9741349d6e4deef081025b234c2f1eba4d57b0d399c98a23215b33601770: Status 404 returned error can't find the container with id d37f9741349d6e4deef081025b234c2f1eba4d57b0d399c98a23215b33601770 Mar 09 10:00:57 crc kubenswrapper[4971]: I0309 10:00:57.143698 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" event={"ID":"719f99c7-f205-4ced-9e1a-911032286254","Type":"ContainerStarted","Data":"a8b7cce18f02193cca732d9769dc6f3e19f411a0b5ad3b777e795ebba71b99e7"} Mar 09 10:00:57 crc kubenswrapper[4971]: I0309 10:00:57.143752 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" event={"ID":"719f99c7-f205-4ced-9e1a-911032286254","Type":"ContainerStarted","Data":"d37f9741349d6e4deef081025b234c2f1eba4d57b0d399c98a23215b33601770"} Mar 09 10:00:57 crc kubenswrapper[4971]: I0309 10:00:57.168041 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" podStartSLOduration=2.168016874 podStartE2EDuration="2.168016874s" podCreationTimestamp="2026-03-09 10:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:57.163718546 +0000 
UTC m=+2460.723646356" watchObservedRunningTime="2026-03-09 10:00:57.168016874 +0000 UTC m=+2460.727944684" Mar 09 10:00:58 crc kubenswrapper[4971]: I0309 10:00:58.153611 4971 generic.go:334] "Generic (PLEG): container finished" podID="719f99c7-f205-4ced-9e1a-911032286254" containerID="a8b7cce18f02193cca732d9769dc6f3e19f411a0b5ad3b777e795ebba71b99e7" exitCode=0 Mar 09 10:00:58 crc kubenswrapper[4971]: I0309 10:00:58.153757 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" event={"ID":"719f99c7-f205-4ced-9e1a-911032286254","Type":"ContainerDied","Data":"a8b7cce18f02193cca732d9769dc6f3e19f411a0b5ad3b777e795ebba71b99e7"} Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.435946 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.478130 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cms7b"] Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.490603 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cms7b"] Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.512330 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.512493 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 
10:00:59.512539 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.513676 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: "719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.513736 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: "719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.513799 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvm5v\" (UniqueName: \"kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.513849 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.513898 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf\") pod \"719f99c7-f205-4ced-9e1a-911032286254\" (UID: \"719f99c7-f205-4ced-9e1a-911032286254\") " Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.514489 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.514516 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/719f99c7-f205-4ced-9e1a-911032286254-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.519432 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v" (OuterVolumeSpecName: "kube-api-access-dvm5v") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: 
"719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "kube-api-access-dvm5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.533044 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts" (OuterVolumeSpecName: "scripts") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: "719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.535658 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: "719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.537735 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "719f99c7-f205-4ced-9e1a-911032286254" (UID: "719f99c7-f205-4ced-9e1a-911032286254"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.616394 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/719f99c7-f205-4ced-9e1a-911032286254-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.616441 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvm5v\" (UniqueName: \"kubernetes.io/projected/719f99c7-f205-4ced-9e1a-911032286254-kube-api-access-dvm5v\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.616455 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:59 crc kubenswrapper[4971]: I0309 10:00:59.616465 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/719f99c7-f205-4ced-9e1a-911032286254-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.140888 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-cron-29550841-pnmz4"] Mar 09 10:01:00 crc kubenswrapper[4971]: E0309 10:01:00.141340 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719f99c7-f205-4ced-9e1a-911032286254" containerName="swift-ring-rebalance" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.141378 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="719f99c7-f205-4ced-9e1a-911032286254" containerName="swift-ring-rebalance" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.141563 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="719f99c7-f205-4ced-9e1a-911032286254" containerName="swift-ring-rebalance" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.142149 4971 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.152207 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-cron-29550841-pnmz4"] Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.176677 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37f9741349d6e4deef081025b234c2f1eba4d57b0d399c98a23215b33601770" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.176746 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cms7b" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.223916 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.223985 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.224047 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmg5j\" (UniqueName: \"kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.326660 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.326732 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmg5j\" (UniqueName: \"kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.326809 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.330798 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.330906 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.346452 4971 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jmg5j\" (UniqueName: \"kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j\") pod \"keystone-cron-29550841-pnmz4\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.462825 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.600083 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4"] Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.601376 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.603115 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.603616 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.612168 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4"] Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732252 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732312 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lpm\" (UniqueName: 
\"kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732444 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732465 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732531 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.732569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.834342 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.834444 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.834514 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lpm\" (UniqueName: \"kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.834573 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.834590 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 
10:01:00.834626 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.835362 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.835840 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.835989 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.839021 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.846892 4971 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.851276 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lpm\" (UniqueName: \"kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm\") pod \"swift-ring-rebalance-debug-2mnb4\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.889829 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-cron-29550841-pnmz4"] Mar 09 10:01:00 crc kubenswrapper[4971]: W0309 10:01:00.896123 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod364959d2_c613_4a79_940c_3c00d24887ca.slice/crio-63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81 WatchSource:0}: Error finding container 63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81: Status 404 returned error can't find the container with id 63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81 Mar 09 10:01:00 crc kubenswrapper[4971]: I0309 10:01:00.936862 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:01 crc kubenswrapper[4971]: I0309 10:01:01.163861 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719f99c7-f205-4ced-9e1a-911032286254" path="/var/lib/kubelet/pods/719f99c7-f205-4ced-9e1a-911032286254/volumes" Mar 09 10:01:01 crc kubenswrapper[4971]: I0309 10:01:01.187580 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" event={"ID":"364959d2-c613-4a79-940c-3c00d24887ca","Type":"ContainerStarted","Data":"731e9cf2336e0a12d0cc55475a2e56d7eec557d3c54856889ae71cc82bd618df"} Mar 09 10:01:01 crc kubenswrapper[4971]: I0309 10:01:01.187653 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" event={"ID":"364959d2-c613-4a79-940c-3c00d24887ca","Type":"ContainerStarted","Data":"63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81"} Mar 09 10:01:01 crc kubenswrapper[4971]: I0309 10:01:01.205711 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" podStartSLOduration=1.205688187 podStartE2EDuration="1.205688187s" podCreationTimestamp="2026-03-09 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:01.202044337 +0000 UTC m=+2464.761972147" watchObservedRunningTime="2026-03-09 10:01:01.205688187 +0000 UTC m=+2464.765615997" Mar 09 10:01:01 crc kubenswrapper[4971]: I0309 10:01:01.355666 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4"] Mar 09 10:01:01 crc kubenswrapper[4971]: W0309 10:01:01.360788 4971 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e4ab3c_2b43_46b8_80d9_9bd361794dae.slice/crio-014b0a611d5b1a0cf9609928acc6725cc4048878ad0bf93f0dd1e95bf89594e2 WatchSource:0}: Error finding container 014b0a611d5b1a0cf9609928acc6725cc4048878ad0bf93f0dd1e95bf89594e2: Status 404 returned error can't find the container with id 014b0a611d5b1a0cf9609928acc6725cc4048878ad0bf93f0dd1e95bf89594e2 Mar 09 10:01:02 crc kubenswrapper[4971]: I0309 10:01:02.197240 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" event={"ID":"21e4ab3c-2b43-46b8-80d9-9bd361794dae","Type":"ContainerStarted","Data":"72ed696f10e7790978612978c1d00833714dcbf7a96e9cf3869a47b6471be3de"} Mar 09 10:01:02 crc kubenswrapper[4971]: I0309 10:01:02.197561 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" event={"ID":"21e4ab3c-2b43-46b8-80d9-9bd361794dae","Type":"ContainerStarted","Data":"014b0a611d5b1a0cf9609928acc6725cc4048878ad0bf93f0dd1e95bf89594e2"} Mar 09 10:01:03 crc kubenswrapper[4971]: I0309 10:01:03.208677 4971 generic.go:334] "Generic (PLEG): container finished" podID="364959d2-c613-4a79-940c-3c00d24887ca" containerID="731e9cf2336e0a12d0cc55475a2e56d7eec557d3c54856889ae71cc82bd618df" exitCode=0 Mar 09 10:01:03 crc kubenswrapper[4971]: I0309 10:01:03.208777 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" event={"ID":"364959d2-c613-4a79-940c-3c00d24887ca","Type":"ContainerDied","Data":"731e9cf2336e0a12d0cc55475a2e56d7eec557d3c54856889ae71cc82bd618df"} Mar 09 10:01:03 crc kubenswrapper[4971]: I0309 10:01:03.211614 4971 generic.go:334] "Generic (PLEG): container finished" podID="21e4ab3c-2b43-46b8-80d9-9bd361794dae" containerID="72ed696f10e7790978612978c1d00833714dcbf7a96e9cf3869a47b6471be3de" exitCode=0 Mar 09 10:01:03 crc kubenswrapper[4971]: I0309 10:01:03.211665 4971 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" event={"ID":"21e4ab3c-2b43-46b8-80d9-9bd361794dae","Type":"ContainerDied","Data":"72ed696f10e7790978612978c1d00833714dcbf7a96e9cf3869a47b6471be3de"} Mar 09 10:01:03 crc kubenswrapper[4971]: I0309 10:01:03.228067 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" podStartSLOduration=3.22805048 podStartE2EDuration="3.22805048s" podCreationTimestamp="2026-03-09 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:02.214714294 +0000 UTC m=+2465.774642104" watchObservedRunningTime="2026-03-09 10:01:03.22805048 +0000 UTC m=+2466.787978290" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.530085 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.536364 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.573693 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4"] Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.579397 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4"] Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.692819 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.692963 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693030 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys\") pod \"364959d2-c613-4a79-940c-3c00d24887ca\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693056 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data\") pod \"364959d2-c613-4a79-940c-3c00d24887ca\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693091 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693119 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lpm\" (UniqueName: \"kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693238 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmg5j\" (UniqueName: \"kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j\") pod \"364959d2-c613-4a79-940c-3c00d24887ca\" (UID: \"364959d2-c613-4a79-940c-3c00d24887ca\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693265 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf\") pod \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\" (UID: \"21e4ab3c-2b43-46b8-80d9-9bd361794dae\") " Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.693855 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.694068 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.699081 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm" (OuterVolumeSpecName: "kube-api-access-99lpm") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "kube-api-access-99lpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.699157 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j" (OuterVolumeSpecName: "kube-api-access-jmg5j") pod "364959d2-c613-4a79-940c-3c00d24887ca" (UID: "364959d2-c613-4a79-940c-3c00d24887ca"). InnerVolumeSpecName "kube-api-access-jmg5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.710482 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "364959d2-c613-4a79-940c-3c00d24887ca" (UID: "364959d2-c613-4a79-940c-3c00d24887ca"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.716890 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts" (OuterVolumeSpecName: "scripts") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.718959 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.721975 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21e4ab3c-2b43-46b8-80d9-9bd361794dae" (UID: "21e4ab3c-2b43-46b8-80d9-9bd361794dae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.738761 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data" (OuterVolumeSpecName: "config-data") pod "364959d2-c613-4a79-940c-3c00d24887ca" (UID: "364959d2-c613-4a79-940c-3c00d24887ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794533 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lpm\" (UniqueName: \"kubernetes.io/projected/21e4ab3c-2b43-46b8-80d9-9bd361794dae-kube-api-access-99lpm\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794575 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmg5j\" (UniqueName: \"kubernetes.io/projected/364959d2-c613-4a79-940c-3c00d24887ca-kube-api-access-jmg5j\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794586 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794599 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794608 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21e4ab3c-2b43-46b8-80d9-9bd361794dae-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794617 4971 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794625 4971 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/364959d2-c613-4a79-940c-3c00d24887ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794633 4971 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21e4ab3c-2b43-46b8-80d9-9bd361794dae-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:04 crc kubenswrapper[4971]: I0309 10:01:04.794641 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21e4ab3c-2b43-46b8-80d9-9bd361794dae-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.159847 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e4ab3c-2b43-46b8-80d9-9bd361794dae" path="/var/lib/kubelet/pods/21e4ab3c-2b43-46b8-80d9-9bd361794dae/volumes" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.235272 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.235470 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-cron-29550841-pnmz4" event={"ID":"364959d2-c613-4a79-940c-3c00d24887ca","Type":"ContainerDied","Data":"63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81"} Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.236150 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a567775285a4b4fa2e651798f1e319e8448189c3e746863928492f05ba7f81" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.237371 4971 scope.go:117] "RemoveContainer" containerID="72ed696f10e7790978612978c1d00833714dcbf7a96e9cf3869a47b6471be3de" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.237503 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2mnb4" Mar 09 10:01:05 crc kubenswrapper[4971]: E0309 10:01:05.299907 4971 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod364959d2_c613_4a79_940c_3c00d24887ca.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e4ab3c_2b43_46b8_80d9_9bd361794dae.slice\": RecentStats: unable to find data in memory cache]" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.744767 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d48pd"] Mar 09 10:01:05 crc kubenswrapper[4971]: E0309 10:01:05.745066 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364959d2-c613-4a79-940c-3c00d24887ca" containerName="keystone-cron" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.745078 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="364959d2-c613-4a79-940c-3c00d24887ca" containerName="keystone-cron" Mar 09 10:01:05 crc kubenswrapper[4971]: E0309 10:01:05.745099 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e4ab3c-2b43-46b8-80d9-9bd361794dae" containerName="swift-ring-rebalance" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.745105 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e4ab3c-2b43-46b8-80d9-9bd361794dae" containerName="swift-ring-rebalance" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.745243 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e4ab3c-2b43-46b8-80d9-9bd361794dae" containerName="swift-ring-rebalance" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.745252 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="364959d2-c613-4a79-940c-3c00d24887ca" containerName="keystone-cron" Mar 09 10:01:05 crc 
kubenswrapper[4971]: I0309 10:01:05.745689 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.748083 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.754526 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.780182 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d48pd"] Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.815004 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.815083 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjd7r\" (UniqueName: \"kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.815120 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc 
kubenswrapper[4971]: I0309 10:01:05.815930 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.815993 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.816181 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918172 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918300 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918376 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918434 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918470 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjd7r\" (UniqueName: \"kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.918497 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.919331 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.919378 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.919899 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.930964 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.931074 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:05 crc kubenswrapper[4971]: I0309 10:01:05.936162 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjd7r\" (UniqueName: \"kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r\") pod \"swift-ring-rebalance-debug-d48pd\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:06 crc kubenswrapper[4971]: I0309 10:01:06.082059 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:06 crc kubenswrapper[4971]: I0309 10:01:06.499492 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d48pd"] Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.256989 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" event={"ID":"472b53a9-3ce4-4dff-bbe5-b62a90ce955a","Type":"ContainerStarted","Data":"3a752b5a987227c9d24eae5351d193caecb4de2ac42d931bca06f7cd140065f7"} Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.257394 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" event={"ID":"472b53a9-3ce4-4dff-bbe5-b62a90ce955a","Type":"ContainerStarted","Data":"d4c60a439333abbf2457f8c14a0de97389e83dfab3fbd9945cc6068e101bb41e"} Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.279018 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" podStartSLOduration=2.278997477 podStartE2EDuration="2.278997477s" podCreationTimestamp="2026-03-09 10:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:07.270967846 +0000 UTC m=+2470.830895656" watchObservedRunningTime="2026-03-09 10:01:07.278997477 +0000 UTC m=+2470.838925297" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.524169 4971 scope.go:117] "RemoveContainer" containerID="de07b8af261cf7553fcab6154141ca1b2d159d0c8e55dcc8a1b5d29ff5142bca" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.555775 4971 scope.go:117] "RemoveContainer" 
containerID="401ab593de2c1adf552ea91a59a10c4422a4e147a2aad4b7488448868ee23ae0" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.589276 4971 scope.go:117] "RemoveContainer" containerID="e62bbf56a2d4847a368cdb1694811859dbf258a333ec4cf664bd68ab2be6cc9b" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.614660 4971 scope.go:117] "RemoveContainer" containerID="c10f2007aa9d1529898e8b8bebdfdbed76f2c34c81e3a788d20064acb41b463f" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.640118 4971 scope.go:117] "RemoveContainer" containerID="cdb8b706eb4a3e8df58f981c50ae4504f7725109f031d3d4724ec6d7c8530b43" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.674665 4971 scope.go:117] "RemoveContainer" containerID="f810a40392bc23040cc1f561eef6f9f4ddcd593ed4be1db42731881536b6c9e6" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.710801 4971 scope.go:117] "RemoveContainer" containerID="88bc415847a0320875c8b7783b8af684e29dcef469f488e1ffb54622628cf7c6" Mar 09 10:01:07 crc kubenswrapper[4971]: I0309 10:01:07.734922 4971 scope.go:117] "RemoveContainer" containerID="fc3b02182033c25c6e76bf709aacf02c7cf9741afd534ab7124f0e09f70bdf8a" Mar 09 10:01:08 crc kubenswrapper[4971]: I0309 10:01:08.265081 4971 generic.go:334] "Generic (PLEG): container finished" podID="472b53a9-3ce4-4dff-bbe5-b62a90ce955a" containerID="3a752b5a987227c9d24eae5351d193caecb4de2ac42d931bca06f7cd140065f7" exitCode=0 Mar 09 10:01:08 crc kubenswrapper[4971]: I0309 10:01:08.265169 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" event={"ID":"472b53a9-3ce4-4dff-bbe5-b62a90ce955a","Type":"ContainerDied","Data":"3a752b5a987227c9d24eae5351d193caecb4de2ac42d931bca06f7cd140065f7"} Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.566573 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.596794 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d48pd"] Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.603767 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-d48pd"] Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672336 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672410 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672447 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjd7r\" (UniqueName: \"kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672479 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672506 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.672521 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift\") pod \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\" (UID: \"472b53a9-3ce4-4dff-bbe5-b62a90ce955a\") " Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.673245 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.673334 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.676769 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r" (OuterVolumeSpecName: "kube-api-access-rjd7r") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "kube-api-access-rjd7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.692690 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.696519 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.699005 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts" (OuterVolumeSpecName: "scripts") pod "472b53a9-3ce4-4dff-bbe5-b62a90ce955a" (UID: "472b53a9-3ce4-4dff-bbe5-b62a90ce955a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.774932 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.774973 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.774984 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjd7r\" (UniqueName: \"kubernetes.io/projected/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-kube-api-access-rjd7r\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.774995 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.775004 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:09 crc kubenswrapper[4971]: I0309 10:01:09.775012 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/472b53a9-3ce4-4dff-bbe5-b62a90ce955a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.283597 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c60a439333abbf2457f8c14a0de97389e83dfab3fbd9945cc6068e101bb41e" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.283715 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-d48pd" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.752877 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq"] Mar 09 10:01:10 crc kubenswrapper[4971]: E0309 10:01:10.753179 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472b53a9-3ce4-4dff-bbe5-b62a90ce955a" containerName="swift-ring-rebalance" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.753191 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="472b53a9-3ce4-4dff-bbe5-b62a90ce955a" containerName="swift-ring-rebalance" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.753382 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="472b53a9-3ce4-4dff-bbe5-b62a90ce955a" containerName="swift-ring-rebalance" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.753860 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.755746 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.757001 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.761975 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq"] Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.889917 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqkb\" (UniqueName: \"kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.890015 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.890122 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.890152 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.890187 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.890256 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf\") pod 
\"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991152 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991267 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991305 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991367 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqkb\" (UniqueName: \"kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991407 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991478 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.991746 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.992110 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:10 crc kubenswrapper[4971]: I0309 10:01:10.992211 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:10.995653 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:10.995709 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:11.010104 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqkb\" (UniqueName: \"kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb\") pod \"swift-ring-rebalance-debug-w4fdq\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:11.069320 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:11.173240 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472b53a9-3ce4-4dff-bbe5-b62a90ce955a" path="/var/lib/kubelet/pods/472b53a9-3ce4-4dff-bbe5-b62a90ce955a/volumes" Mar 09 10:01:11 crc kubenswrapper[4971]: I0309 10:01:11.471337 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq"] Mar 09 10:01:12 crc kubenswrapper[4971]: I0309 10:01:12.300359 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" event={"ID":"2d1fdf69-e50e-45f7-91ba-caaef78417be","Type":"ContainerStarted","Data":"8942d6c97e5f582863ad23c16a9fb047210557bc9140a3cdd97e02d4818c2b6d"} Mar 09 10:01:12 crc kubenswrapper[4971]: I0309 10:01:12.300669 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" event={"ID":"2d1fdf69-e50e-45f7-91ba-caaef78417be","Type":"ContainerStarted","Data":"fce95dee5addd513d1a10adca789606eb2949dee112c6e9266cb3cbbcffd7bd5"} Mar 09 10:01:12 crc kubenswrapper[4971]: I0309 10:01:12.327805 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" podStartSLOduration=2.327783762 podStartE2EDuration="2.327783762s" podCreationTimestamp="2026-03-09 10:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:12.321938492 +0000 UTC m=+2475.881866312" watchObservedRunningTime="2026-03-09 10:01:12.327783762 +0000 UTC m=+2475.887711582" Mar 09 10:01:13 crc kubenswrapper[4971]: I0309 10:01:13.320424 4971 generic.go:334] "Generic (PLEG): container finished" podID="2d1fdf69-e50e-45f7-91ba-caaef78417be" containerID="8942d6c97e5f582863ad23c16a9fb047210557bc9140a3cdd97e02d4818c2b6d" exitCode=0 
Mar 09 10:01:13 crc kubenswrapper[4971]: I0309 10:01:13.320474 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" event={"ID":"2d1fdf69-e50e-45f7-91ba-caaef78417be","Type":"ContainerDied","Data":"8942d6c97e5f582863ad23c16a9fb047210557bc9140a3cdd97e02d4818c2b6d"} Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.577263 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.607683 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq"] Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.612943 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq"] Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.656800 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.656886 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.656961 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.657107 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqkb\" (UniqueName: \"kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.657163 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.657190 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift\") pod \"2d1fdf69-e50e-45f7-91ba-caaef78417be\" (UID: \"2d1fdf69-e50e-45f7-91ba-caaef78417be\") " Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.658452 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.658506 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.662636 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb" (OuterVolumeSpecName: "kube-api-access-dnqkb") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "kube-api-access-dnqkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.678633 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.678988 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.688189 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts" (OuterVolumeSpecName: "scripts") pod "2d1fdf69-e50e-45f7-91ba-caaef78417be" (UID: "2d1fdf69-e50e-45f7-91ba-caaef78417be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.759182 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqkb\" (UniqueName: \"kubernetes.io/projected/2d1fdf69-e50e-45f7-91ba-caaef78417be-kube-api-access-dnqkb\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.759548 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.759681 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2d1fdf69-e50e-45f7-91ba-caaef78417be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.759846 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d1fdf69-e50e-45f7-91ba-caaef78417be-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.759972 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.760087 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2d1fdf69-e50e-45f7-91ba-caaef78417be-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.795470 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 09 10:01:14 crc kubenswrapper[4971]: I0309 10:01:14.795906 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.163295 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1fdf69-e50e-45f7-91ba-caaef78417be" path="/var/lib/kubelet/pods/2d1fdf69-e50e-45f7-91ba-caaef78417be/volumes" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.339428 4971 scope.go:117] "RemoveContainer" containerID="8942d6c97e5f582863ad23c16a9fb047210557bc9140a3cdd97e02d4818c2b6d" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.339686 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w4fdq" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.737744 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5"] Mar 09 10:01:15 crc kubenswrapper[4971]: E0309 10:01:15.738806 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1fdf69-e50e-45f7-91ba-caaef78417be" containerName="swift-ring-rebalance" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.738831 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1fdf69-e50e-45f7-91ba-caaef78417be" containerName="swift-ring-rebalance" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.738986 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1fdf69-e50e-45f7-91ba-caaef78417be" containerName="swift-ring-rebalance" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.739549 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.741670 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.741885 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.747042 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5"] Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.776920 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.776968 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.777013 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.777054 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmwk8\" (UniqueName: \"kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.777078 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.777100 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.878412 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.878503 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc 
kubenswrapper[4971]: I0309 10:01:15.878558 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.878602 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmwk8\" (UniqueName: \"kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.878624 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.879174 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.879391 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: 
I0309 10:01:15.879425 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.879963 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.883402 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.893894 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:15 crc kubenswrapper[4971]: I0309 10:01:15.898178 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmwk8\" (UniqueName: \"kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8\") pod \"swift-ring-rebalance-debug-hvpn5\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:16 crc kubenswrapper[4971]: I0309 
10:01:16.059877 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:16 crc kubenswrapper[4971]: I0309 10:01:16.485043 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5"] Mar 09 10:01:16 crc kubenswrapper[4971]: W0309 10:01:16.491535 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d1f30ab_2ad8_4422_a9b9_14d349ff4ef1.slice/crio-73e65ba2a1962b8a57d42f57a30859ed76b172f18b04bc7ac81fd453ff95c09c WatchSource:0}: Error finding container 73e65ba2a1962b8a57d42f57a30859ed76b172f18b04bc7ac81fd453ff95c09c: Status 404 returned error can't find the container with id 73e65ba2a1962b8a57d42f57a30859ed76b172f18b04bc7ac81fd453ff95c09c Mar 09 10:01:17 crc kubenswrapper[4971]: I0309 10:01:17.362490 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" event={"ID":"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1","Type":"ContainerStarted","Data":"39a3b1b3cc4e2473b788319e2286c8add3dd9867083cf796351ec71ee81267bc"} Mar 09 10:01:17 crc kubenswrapper[4971]: I0309 10:01:17.362852 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" event={"ID":"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1","Type":"ContainerStarted","Data":"73e65ba2a1962b8a57d42f57a30859ed76b172f18b04bc7ac81fd453ff95c09c"} Mar 09 10:01:17 crc kubenswrapper[4971]: I0309 10:01:17.385303 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" podStartSLOduration=2.385281129 podStartE2EDuration="2.385281129s" podCreationTimestamp="2026-03-09 10:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:17.3769196 +0000 UTC 
m=+2480.936847410" watchObservedRunningTime="2026-03-09 10:01:17.385281129 +0000 UTC m=+2480.945208949" Mar 09 10:01:18 crc kubenswrapper[4971]: I0309 10:01:18.374125 4971 generic.go:334] "Generic (PLEG): container finished" podID="3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" containerID="39a3b1b3cc4e2473b788319e2286c8add3dd9867083cf796351ec71ee81267bc" exitCode=0 Mar 09 10:01:18 crc kubenswrapper[4971]: I0309 10:01:18.374272 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" event={"ID":"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1","Type":"ContainerDied","Data":"39a3b1b3cc4e2473b788319e2286c8add3dd9867083cf796351ec71ee81267bc"} Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.701395 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.727285 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5"] Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.732579 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5"] Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849375 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849449 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849516 4971 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849554 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849627 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.849681 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmwk8\" (UniqueName: \"kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8\") pod \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\" (UID: \"3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1\") " Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.851094 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.851286 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.856113 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8" (OuterVolumeSpecName: "kube-api-access-cmwk8") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "kube-api-access-cmwk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.875517 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.876171 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts" (OuterVolumeSpecName: "scripts") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.879112 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" (UID: "3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951033 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951068 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmwk8\" (UniqueName: \"kubernetes.io/projected/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-kube-api-access-cmwk8\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951082 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951090 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951098 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:19 crc kubenswrapper[4971]: I0309 10:01:19.951107 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.393226 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73e65ba2a1962b8a57d42f57a30859ed76b172f18b04bc7ac81fd453ff95c09c" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.393277 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hvpn5" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.881705 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"] Mar 09 10:01:20 crc kubenswrapper[4971]: E0309 10:01:20.882313 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" containerName="swift-ring-rebalance" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.882326 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" containerName="swift-ring-rebalance" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.882511 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" containerName="swift-ring-rebalance" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.882994 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.885090 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.885315 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 09 10:01:20 crc kubenswrapper[4971]: I0309 10:01:20.895956 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"] Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068001 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6csr\" (UniqueName: \"kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068236 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068519 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068572 4971 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068606 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.068677 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.162223 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1" path="/var/lib/kubelet/pods/3d1f30ab-2ad8-4422-a9b9-14d349ff4ef1/volumes" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170605 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170678 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170709 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170749 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170781 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6csr\" (UniqueName: \"kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.170849 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.172329 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.172822 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.173021 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.178105 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.178105 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.196252 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6csr\" (UniqueName: 
\"kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr\") pod \"swift-ring-rebalance-debug-gtr5q\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.213405 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:21 crc kubenswrapper[4971]: I0309 10:01:21.635049 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"] Mar 09 10:01:22 crc kubenswrapper[4971]: I0309 10:01:22.418245 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" event={"ID":"5c294a9a-868c-49e1-80da-2e616f2714f0","Type":"ContainerStarted","Data":"1e73fd92ed285bc0bf36e046de4ff60dd32c347e19b0520e2b56d550b3227d40"} Mar 09 10:01:22 crc kubenswrapper[4971]: I0309 10:01:22.418662 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" event={"ID":"5c294a9a-868c-49e1-80da-2e616f2714f0","Type":"ContainerStarted","Data":"f2aafc9b76ad27ee48c091c1370afa908e41f95dfe1a2223cd048dcf85686d54"} Mar 09 10:01:22 crc kubenswrapper[4971]: I0309 10:01:22.444692 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" podStartSLOduration=2.444670717 podStartE2EDuration="2.444670717s" podCreationTimestamp="2026-03-09 10:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:22.435554396 +0000 UTC m=+2485.995482246" watchObservedRunningTime="2026-03-09 10:01:22.444670717 +0000 UTC m=+2486.004598567" Mar 09 10:01:23 crc kubenswrapper[4971]: I0309 10:01:23.427445 4971 generic.go:334] "Generic (PLEG): container finished" 
podID="5c294a9a-868c-49e1-80da-2e616f2714f0" containerID="1e73fd92ed285bc0bf36e046de4ff60dd32c347e19b0520e2b56d550b3227d40" exitCode=0 Mar 09 10:01:23 crc kubenswrapper[4971]: I0309 10:01:23.427491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" event={"ID":"5c294a9a-868c-49e1-80da-2e616f2714f0","Type":"ContainerDied","Data":"1e73fd92ed285bc0bf36e046de4ff60dd32c347e19b0520e2b56d550b3227d40"} Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.726710 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.763750 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"] Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.771664 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"] Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839107 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts\") pod \"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839161 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift\") pod \"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839202 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6csr\" (UniqueName: \"kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr\") pod 
\"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839261 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices\") pod \"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839338 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf\") pod \"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839417 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf\") pod \"5c294a9a-868c-49e1-80da-2e616f2714f0\" (UID: \"5c294a9a-868c-49e1-80da-2e616f2714f0\") " Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839965 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.839975 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.840197 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.840213 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c294a9a-868c-49e1-80da-2e616f2714f0-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.845101 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr" (OuterVolumeSpecName: "kube-api-access-r6csr") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "kube-api-access-r6csr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.865216 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts" (OuterVolumeSpecName: "scripts") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.867487 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.872523 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5c294a9a-868c-49e1-80da-2e616f2714f0" (UID: "5c294a9a-868c-49e1-80da-2e616f2714f0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.942069 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.942111 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c294a9a-868c-49e1-80da-2e616f2714f0-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.942125 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c294a9a-868c-49e1-80da-2e616f2714f0-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:24 crc kubenswrapper[4971]: I0309 10:01:24.942140 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6csr\" (UniqueName: \"kubernetes.io/projected/5c294a9a-868c-49e1-80da-2e616f2714f0-kube-api-access-r6csr\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.166810 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c294a9a-868c-49e1-80da-2e616f2714f0" path="/var/lib/kubelet/pods/5c294a9a-868c-49e1-80da-2e616f2714f0/volumes"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.446564 4971 scope.go:117] "RemoveContainer" containerID="1e73fd92ed285bc0bf36e046de4ff60dd32c347e19b0520e2b56d550b3227d40"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.446631 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gtr5q"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.909405 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-djngc"]
Mar 09 10:01:25 crc kubenswrapper[4971]: E0309 10:01:25.910156 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c294a9a-868c-49e1-80da-2e616f2714f0" containerName="swift-ring-rebalance"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.910173 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c294a9a-868c-49e1-80da-2e616f2714f0" containerName="swift-ring-rebalance"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.910311 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c294a9a-868c-49e1-80da-2e616f2714f0" containerName="swift-ring-rebalance"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.910844 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.912916 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.914871 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.932140 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-djngc"]
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953708 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrw6\" (UniqueName: \"kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953774 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953806 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953891 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953918 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:25 crc kubenswrapper[4971]: I0309 10:01:25.953956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055643 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055693 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055724 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055783 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrw6\" (UniqueName: \"kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055817 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.055835 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.056599 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.056691 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.056799 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.066424 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.071961 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.072960 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrw6\" (UniqueName: \"kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6\") pod \"swift-ring-rebalance-debug-djngc\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.275186 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:26 crc kubenswrapper[4971]: I0309 10:01:26.695225 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-djngc"]
Mar 09 10:01:27 crc kubenswrapper[4971]: I0309 10:01:27.472403 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc" event={"ID":"78c79918-db02-4094-985a-f6da439ac6a7","Type":"ContainerStarted","Data":"84a0b990a76a3e599c0ae045b32188cf72edcc478ba44d1eaf4590ae2b360387"}
Mar 09 10:01:27 crc kubenswrapper[4971]: I0309 10:01:27.472767 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc" event={"ID":"78c79918-db02-4094-985a-f6da439ac6a7","Type":"ContainerStarted","Data":"3c21519e43054bea6566815b05ce68b1339d4841043cc98ae0d13c99c5eddd51"}
Mar 09 10:01:27 crc kubenswrapper[4971]: I0309 10:01:27.501146 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc" podStartSLOduration=2.501128594 podStartE2EDuration="2.501128594s" podCreationTimestamp="2026-03-09 10:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:27.499680114 +0000 UTC m=+2491.059607934" watchObservedRunningTime="2026-03-09 10:01:27.501128594 +0000 UTC m=+2491.061056414"
Mar 09 10:01:28 crc kubenswrapper[4971]: I0309 10:01:28.484583 4971 generic.go:334] "Generic (PLEG): container finished" podID="78c79918-db02-4094-985a-f6da439ac6a7" containerID="84a0b990a76a3e599c0ae045b32188cf72edcc478ba44d1eaf4590ae2b360387" exitCode=0
Mar 09 10:01:28 crc kubenswrapper[4971]: I0309 10:01:28.484645 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc" event={"ID":"78c79918-db02-4094-985a-f6da439ac6a7","Type":"ContainerDied","Data":"84a0b990a76a3e599c0ae045b32188cf72edcc478ba44d1eaf4590ae2b360387"}
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.721145 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.749159 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-djngc"]
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.754872 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-djngc"]
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.923626 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.923682 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.923831 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.923910 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.923978 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.924010 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkrw6\" (UniqueName: \"kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6\") pod \"78c79918-db02-4094-985a-f6da439ac6a7\" (UID: \"78c79918-db02-4094-985a-f6da439ac6a7\") "
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.924665 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.924796 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.929143 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6" (OuterVolumeSpecName: "kube-api-access-zkrw6") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "kube-api-access-zkrw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.947305 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts" (OuterVolumeSpecName: "scripts") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.949881 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:29 crc kubenswrapper[4971]: I0309 10:01:29.950433 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "78c79918-db02-4094-985a-f6da439ac6a7" (UID: "78c79918-db02-4094-985a-f6da439ac6a7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025725 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkrw6\" (UniqueName: \"kubernetes.io/projected/78c79918-db02-4094-985a-f6da439ac6a7-kube-api-access-zkrw6\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025772 4971 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025785 4971 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c79918-db02-4094-985a-f6da439ac6a7-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025797 4971 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025810 4971 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c79918-db02-4094-985a-f6da439ac6a7-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.025822 4971 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c79918-db02-4094-985a-f6da439ac6a7-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.505335 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c21519e43054bea6566815b05ce68b1339d4841043cc98ae0d13c99c5eddd51"
Mar 09 10:01:30 crc kubenswrapper[4971]: I0309 10:01:30.505462 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-djngc"
Mar 09 10:01:31 crc kubenswrapper[4971]: I0309 10:01:31.164611 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c79918-db02-4094-985a-f6da439ac6a7" path="/var/lib/kubelet/pods/78c79918-db02-4094-985a-f6da439ac6a7/volumes"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.301451 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"]
Mar 09 10:01:35 crc kubenswrapper[4971]: E0309 10:01:35.302374 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c79918-db02-4094-985a-f6da439ac6a7" containerName="swift-ring-rebalance"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.302390 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c79918-db02-4094-985a-f6da439ac6a7" containerName="swift-ring-rebalance"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.302562 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c79918-db02-4094-985a-f6da439ac6a7" containerName="swift-ring-rebalance"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.303733 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.313341 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"]
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.319778 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wl7\" (UniqueName: \"kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.319897 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.319921 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.420863 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.420907 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.420952 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wl7\" (UniqueName: \"kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.421808 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.421929 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.473562 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wl7\" (UniqueName: \"kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7\") pod \"redhat-marketplace-j86cm\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:35 crc kubenswrapper[4971]: I0309 10:01:35.626169 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:36 crc kubenswrapper[4971]: I0309 10:01:36.106364 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"]
Mar 09 10:01:36 crc kubenswrapper[4971]: I0309 10:01:36.565073 4971 generic.go:334] "Generic (PLEG): container finished" podID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerID="c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e" exitCode=0
Mar 09 10:01:36 crc kubenswrapper[4971]: I0309 10:01:36.565476 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerDied","Data":"c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e"}
Mar 09 10:01:36 crc kubenswrapper[4971]: I0309 10:01:36.565509 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerStarted","Data":"f1d78d66f6f209510efdd0fc28fc27a86eea6827f7991661809499aef7017647"}
Mar 09 10:01:36 crc kubenswrapper[4971]: I0309 10:01:36.566795 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 10:01:38 crc kubenswrapper[4971]: I0309 10:01:38.586825 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerStarted","Data":"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b"}
Mar 09 10:01:39 crc kubenswrapper[4971]: I0309 10:01:39.598391 4971 generic.go:334] "Generic (PLEG): container finished" podID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerID="a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b" exitCode=0
Mar 09 10:01:39 crc kubenswrapper[4971]: I0309 10:01:39.598447 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerDied","Data":"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b"}
Mar 09 10:01:40 crc kubenswrapper[4971]: I0309 10:01:40.608142 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerStarted","Data":"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0"}
Mar 09 10:01:40 crc kubenswrapper[4971]: I0309 10:01:40.670456 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j86cm" podStartSLOduration=2.050836797 podStartE2EDuration="5.670438195s" podCreationTimestamp="2026-03-09 10:01:35 +0000 UTC" firstStartedPulling="2026-03-09 10:01:36.566545633 +0000 UTC m=+2500.126473443" lastFinishedPulling="2026-03-09 10:01:40.186147031 +0000 UTC m=+2503.746074841" observedRunningTime="2026-03-09 10:01:40.665407896 +0000 UTC m=+2504.225335726" watchObservedRunningTime="2026-03-09 10:01:40.670438195 +0000 UTC m=+2504.230366005"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.677935 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"]
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.680720 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.693202 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"]
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.864583 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.865103 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.865221 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn4m7\" (UniqueName: \"kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.967272 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.967632 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn4m7\" (UniqueName: \"kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.967865 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.967873 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.968383 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.986413 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn4m7\" (UniqueName: \"kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7\") pod \"redhat-operators-kjjbv\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:41 crc kubenswrapper[4971]: I0309 10:01:41.996660 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjjbv"
Mar 09 10:01:42 crc kubenswrapper[4971]: I0309 10:01:42.420655 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"]
Mar 09 10:01:42 crc kubenswrapper[4971]: W0309 10:01:42.424675 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b392e6_c5b9_4bdf_93a2_9a10abd331bc.slice/crio-e2ccfe13f9eaeaf83917abc9dab5eb64f0bdeaaf2d4556853993649980ca4e9f WatchSource:0}: Error finding container e2ccfe13f9eaeaf83917abc9dab5eb64f0bdeaaf2d4556853993649980ca4e9f: Status 404 returned error can't find the container with id e2ccfe13f9eaeaf83917abc9dab5eb64f0bdeaaf2d4556853993649980ca4e9f
Mar 09 10:01:42 crc kubenswrapper[4971]: I0309 10:01:42.626610 4971 generic.go:334] "Generic (PLEG): container finished" podID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerID="bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c" exitCode=0
Mar 09 10:01:42 crc kubenswrapper[4971]: I0309 10:01:42.626659 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerDied","Data":"bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c"}
Mar 09 10:01:42 crc kubenswrapper[4971]: I0309 10:01:42.626688 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerStarted","Data":"e2ccfe13f9eaeaf83917abc9dab5eb64f0bdeaaf2d4556853993649980ca4e9f"}
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.644491 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerStarted","Data":"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d"}
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.794964 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.795045 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.795096 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p56wx"
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.795788 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 10:01:44 crc kubenswrapper[4971]: I0309 10:01:44.795843 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" gracePeriod=600
Mar 09 10:01:45 crc kubenswrapper[4971]: I0309 10:01:45.627771 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:45 crc kubenswrapper[4971]: I0309 10:01:45.628126 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:45 crc kubenswrapper[4971]: I0309 10:01:45.682513 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:45 crc kubenswrapper[4971]: I0309 10:01:45.734666 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j86cm"
Mar 09 10:01:46 crc kubenswrapper[4971]: I0309 10:01:46.663681 4971 generic.go:334] "Generic (PLEG): container finished" podID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerID="91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d" exitCode=0
Mar 09 10:01:46 crc kubenswrapper[4971]: I0309 10:01:46.663756 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerDied","Data":"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d"}
Mar 09 10:01:46 crc kubenswrapper[4971]: I0309 10:01:46.870648 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"]
Mar 09 10:01:47 crc kubenswrapper[4971]: E0309 10:01:47.617123 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75"
Mar 09 10:01:47 crc kubenswrapper[4971]: I0309 10:01:47.678109
4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" exitCode=0 Mar 09 10:01:47 crc kubenswrapper[4971]: I0309 10:01:47.678326 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j86cm" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="registry-server" containerID="cri-o://d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0" gracePeriod=2 Mar 09 10:01:47 crc kubenswrapper[4971]: I0309 10:01:47.678575 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819"} Mar 09 10:01:47 crc kubenswrapper[4971]: I0309 10:01:47.678665 4971 scope.go:117] "RemoveContainer" containerID="fb854a481092dad066a02e66c2ebd6763e161f9c45ef6671e752ecdc7ae089b9" Mar 09 10:01:47 crc kubenswrapper[4971]: I0309 10:01:47.679315 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:01:47 crc kubenswrapper[4971]: E0309 10:01:47.679790 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.085153 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j86cm" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.260939 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5wl7\" (UniqueName: \"kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7\") pod \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.261641 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities\") pod \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.261993 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content\") pod \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\" (UID: \"2031a2e3-d41e-4191-8c20-c6ab40deaf2a\") " Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.262258 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities" (OuterVolumeSpecName: "utilities") pod "2031a2e3-d41e-4191-8c20-c6ab40deaf2a" (UID: "2031a2e3-d41e-4191-8c20-c6ab40deaf2a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.262865 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.266497 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7" (OuterVolumeSpecName: "kube-api-access-m5wl7") pod "2031a2e3-d41e-4191-8c20-c6ab40deaf2a" (UID: "2031a2e3-d41e-4191-8c20-c6ab40deaf2a"). InnerVolumeSpecName "kube-api-access-m5wl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.289138 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2031a2e3-d41e-4191-8c20-c6ab40deaf2a" (UID: "2031a2e3-d41e-4191-8c20-c6ab40deaf2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.364550 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.364595 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5wl7\" (UniqueName: \"kubernetes.io/projected/2031a2e3-d41e-4191-8c20-c6ab40deaf2a-kube-api-access-m5wl7\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.688438 4971 generic.go:334] "Generic (PLEG): container finished" podID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerID="d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0" exitCode=0 Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.688498 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerDied","Data":"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0"} Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.688523 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j86cm" event={"ID":"2031a2e3-d41e-4191-8c20-c6ab40deaf2a","Type":"ContainerDied","Data":"f1d78d66f6f209510efdd0fc28fc27a86eea6827f7991661809499aef7017647"} Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.688539 4971 scope.go:117] "RemoveContainer" containerID="d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.688632 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j86cm" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.696423 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerStarted","Data":"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815"} Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.712018 4971 scope.go:117] "RemoveContainer" containerID="a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.732195 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kjjbv" podStartSLOduration=2.804049753 podStartE2EDuration="7.732167434s" podCreationTimestamp="2026-03-09 10:01:41 +0000 UTC" firstStartedPulling="2026-03-09 10:01:42.627991218 +0000 UTC m=+2506.187919028" lastFinishedPulling="2026-03-09 10:01:47.556108899 +0000 UTC m=+2511.116036709" observedRunningTime="2026-03-09 10:01:48.720669398 +0000 UTC m=+2512.280597218" watchObservedRunningTime="2026-03-09 10:01:48.732167434 +0000 UTC m=+2512.292095294" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.748938 4971 scope.go:117] "RemoveContainer" containerID="c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.749076 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"] Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.755859 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j86cm"] Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.780726 4971 scope.go:117] "RemoveContainer" containerID="d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0" Mar 09 10:01:48 crc kubenswrapper[4971]: E0309 10:01:48.781111 4971 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0\": container with ID starting with d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0 not found: ID does not exist" containerID="d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.781155 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0"} err="failed to get container status \"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0\": rpc error: code = NotFound desc = could not find container \"d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0\": container with ID starting with d47652097c9fe02457c7d2a94fb3211dac605062d4f3ad120e9cd4514905dbe0 not found: ID does not exist" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.781181 4971 scope.go:117] "RemoveContainer" containerID="a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b" Mar 09 10:01:48 crc kubenswrapper[4971]: E0309 10:01:48.781418 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b\": container with ID starting with a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b not found: ID does not exist" containerID="a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.781454 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b"} err="failed to get container status \"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b\": rpc error: code = NotFound desc = could 
not find container \"a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b\": container with ID starting with a499d85b4bbefd76e1810d7d15bd79f564c693519a9dd5d18d482a3be384061b not found: ID does not exist" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.781507 4971 scope.go:117] "RemoveContainer" containerID="c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e" Mar 09 10:01:48 crc kubenswrapper[4971]: E0309 10:01:48.782134 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e\": container with ID starting with c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e not found: ID does not exist" containerID="c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e" Mar 09 10:01:48 crc kubenswrapper[4971]: I0309 10:01:48.782167 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e"} err="failed to get container status \"c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e\": rpc error: code = NotFound desc = could not find container \"c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e\": container with ID starting with c1b6bf9238780af0499ab500f7f3342fedda8c4f000da2e9d9af7189faf8159e not found: ID does not exist" Mar 09 10:01:49 crc kubenswrapper[4971]: I0309 10:01:49.160449 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" path="/var/lib/kubelet/pods/2031a2e3-d41e-4191-8c20-c6ab40deaf2a/volumes" Mar 09 10:01:51 crc kubenswrapper[4971]: I0309 10:01:51.997449 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:01:51 crc kubenswrapper[4971]: I0309 10:01:51.997743 4971 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:01:53 crc kubenswrapper[4971]: I0309 10:01:53.038330 4971 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kjjbv" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="registry-server" probeResult="failure" output=< Mar 09 10:01:53 crc kubenswrapper[4971]: timeout: failed to connect service ":50051" within 1s Mar 09 10:01:53 crc kubenswrapper[4971]: > Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.137865 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550842-kwsrg"] Mar 09 10:02:00 crc kubenswrapper[4971]: E0309 10:02:00.138756 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="extract-content" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.138769 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="extract-content" Mar 09 10:02:00 crc kubenswrapper[4971]: E0309 10:02:00.138781 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="extract-utilities" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.138789 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="extract-utilities" Mar 09 10:02:00 crc kubenswrapper[4971]: E0309 10:02:00.138809 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="registry-server" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.138815 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="registry-server" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.138977 4971 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2031a2e3-d41e-4191-8c20-c6ab40deaf2a" containerName="registry-server" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.139472 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.143126 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.143530 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.143553 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.148830 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-kwsrg"] Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.231090 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jd6\" (UniqueName: \"kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6\") pod \"auto-csr-approver-29550842-kwsrg\" (UID: \"b62d5246-a7fe-4be6-9935-732dafc959a0\") " pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.332590 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jd6\" (UniqueName: \"kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6\") pod \"auto-csr-approver-29550842-kwsrg\" (UID: \"b62d5246-a7fe-4be6-9935-732dafc959a0\") " pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.350775 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jd6\" 
(UniqueName: \"kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6\") pod \"auto-csr-approver-29550842-kwsrg\" (UID: \"b62d5246-a7fe-4be6-9935-732dafc959a0\") " pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:00 crc kubenswrapper[4971]: I0309 10:02:00.463717 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:01 crc kubenswrapper[4971]: I0309 10:02:01.455848 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-kwsrg"] Mar 09 10:02:01 crc kubenswrapper[4971]: I0309 10:02:01.820398 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" event={"ID":"b62d5246-a7fe-4be6-9935-732dafc959a0","Type":"ContainerStarted","Data":"9350316beb7aa63c36148d198fedea74b651611ea844fe84cc54a4649485e1c1"} Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.031400 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kcgfg/must-gather-xchvf"] Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.032949 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.036568 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kcgfg"/"default-dockercfg-j28kb" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.036900 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kcgfg"/"openshift-service-ca.crt" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.037211 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kcgfg"/"kube-root-ca.crt" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.060674 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kcgfg/must-gather-xchvf"] Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.097159 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.152056 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:02:02 crc kubenswrapper[4971]: E0309 10:02:02.152230 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.166989 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxs6k\" (UniqueName: \"kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k\") pod \"must-gather-xchvf\" (UID: 
\"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.167136 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output\") pod \"must-gather-xchvf\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.167702 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.269206 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxs6k\" (UniqueName: \"kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k\") pod \"must-gather-xchvf\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.269316 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output\") pod \"must-gather-xchvf\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.269774 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output\") pod \"must-gather-xchvf\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.289083 4971 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-gxs6k\" (UniqueName: \"kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k\") pod \"must-gather-xchvf\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.341788 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"] Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.354578 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.762090 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kcgfg/must-gather-xchvf"] Mar 09 10:02:02 crc kubenswrapper[4971]: W0309 10:02:02.770333 4971 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f1f59d_f892_41c9_bbcf_f1a1f8fd9677.slice/crio-5bb849f4ca283e3d4002d06663ce061fc661bdcfc3519dd296a1def76f835f37 WatchSource:0}: Error finding container 5bb849f4ca283e3d4002d06663ce061fc661bdcfc3519dd296a1def76f835f37: Status 404 returned error can't find the container with id 5bb849f4ca283e3d4002d06663ce061fc661bdcfc3519dd296a1def76f835f37 Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.827337 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kcgfg/must-gather-xchvf" event={"ID":"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677","Type":"ContainerStarted","Data":"5bb849f4ca283e3d4002d06663ce061fc661bdcfc3519dd296a1def76f835f37"} Mar 09 10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.828817 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" event={"ID":"b62d5246-a7fe-4be6-9935-732dafc959a0","Type":"ContainerStarted","Data":"d60e3d02ec86c24df0b1021efac9bdd6358fe7ca123da1f824e62aaa57122002"} Mar 09 
10:02:02 crc kubenswrapper[4971]: I0309 10:02:02.845470 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" podStartSLOduration=1.943297076 podStartE2EDuration="2.845450287s" podCreationTimestamp="2026-03-09 10:02:00 +0000 UTC" firstStartedPulling="2026-03-09 10:02:01.464499475 +0000 UTC m=+2525.024427285" lastFinishedPulling="2026-03-09 10:02:02.366652686 +0000 UTC m=+2525.926580496" observedRunningTime="2026-03-09 10:02:02.842298207 +0000 UTC m=+2526.402226017" watchObservedRunningTime="2026-03-09 10:02:02.845450287 +0000 UTC m=+2526.405378097" Mar 09 10:02:03 crc kubenswrapper[4971]: I0309 10:02:03.839918 4971 generic.go:334] "Generic (PLEG): container finished" podID="b62d5246-a7fe-4be6-9935-732dafc959a0" containerID="d60e3d02ec86c24df0b1021efac9bdd6358fe7ca123da1f824e62aaa57122002" exitCode=0 Mar 09 10:02:03 crc kubenswrapper[4971]: I0309 10:02:03.839972 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" event={"ID":"b62d5246-a7fe-4be6-9935-732dafc959a0","Type":"ContainerDied","Data":"d60e3d02ec86c24df0b1021efac9bdd6358fe7ca123da1f824e62aaa57122002"} Mar 09 10:02:03 crc kubenswrapper[4971]: I0309 10:02:03.840307 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kjjbv" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="registry-server" containerID="cri-o://d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815" gracePeriod=2 Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.270422 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.396979 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities\") pod \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.397047 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content\") pod \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.397117 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn4m7\" (UniqueName: \"kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7\") pod \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\" (UID: \"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc\") " Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.397860 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities" (OuterVolumeSpecName: "utilities") pod "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" (UID: "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.403278 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7" (OuterVolumeSpecName: "kube-api-access-rn4m7") pod "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" (UID: "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc"). InnerVolumeSpecName "kube-api-access-rn4m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.499541 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.499587 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn4m7\" (UniqueName: \"kubernetes.io/projected/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-kube-api-access-rn4m7\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.547036 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" (UID: "e3b392e6-c5b9-4bdf-93a2-9a10abd331bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.602264 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.856247 4971 generic.go:334] "Generic (PLEG): container finished" podID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerID="d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815" exitCode=0 Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.856306 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerDied","Data":"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815"} Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.856373 4971 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kjjbv" event={"ID":"e3b392e6-c5b9-4bdf-93a2-9a10abd331bc","Type":"ContainerDied","Data":"e2ccfe13f9eaeaf83917abc9dab5eb64f0bdeaaf2d4556853993649980ca4e9f"} Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.856400 4971 scope.go:117] "RemoveContainer" containerID="d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.856457 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjjbv" Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.900937 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"] Mar 09 10:02:04 crc kubenswrapper[4971]: I0309 10:02:04.906552 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kjjbv"] Mar 09 10:02:05 crc kubenswrapper[4971]: I0309 10:02:05.162880 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" path="/var/lib/kubelet/pods/e3b392e6-c5b9-4bdf-93a2-9a10abd331bc/volumes" Mar 09 10:02:07 crc kubenswrapper[4971]: I0309 10:02:07.923602 4971 scope.go:117] "RemoveContainer" containerID="dc45fcc3a20b773814ee053ffe3fefd339d84c7c11f2deacdc70a5377133616a" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.075999 4971 scope.go:117] "RemoveContainer" containerID="91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.104861 4971 scope.go:117] "RemoveContainer" containerID="b7359a996de529e746fd0c2682b33c6703ef9ce3b8fdfc095db7ac9f4498e328" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.122398 4971 scope.go:117] "RemoveContainer" containerID="bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.140741 4971 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.171835 4971 scope.go:117] "RemoveContainer" containerID="dd164751dddf8119010b520442b9226651c4d94e43e1fd8edc0ae8f360889bd4" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.184433 4971 scope.go:117] "RemoveContainer" containerID="d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815" Mar 09 10:02:09 crc kubenswrapper[4971]: E0309 10:02:09.184889 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815\": container with ID starting with d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815 not found: ID does not exist" containerID="d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.184933 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815"} err="failed to get container status \"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815\": rpc error: code = NotFound desc = could not find container \"d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815\": container with ID starting with d006b1b6eb2da484e275c5d1bbe43ea3becf095e43166c61eb0890d117ec8815 not found: ID does not exist" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.184961 4971 scope.go:117] "RemoveContainer" containerID="91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d" Mar 09 10:02:09 crc kubenswrapper[4971]: E0309 10:02:09.185558 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d\": container with ID starting with 
91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d not found: ID does not exist" containerID="91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.185589 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d"} err="failed to get container status \"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d\": rpc error: code = NotFound desc = could not find container \"91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d\": container with ID starting with 91837da4346ec13d6f7718f4f9cb7795110e866c3d0745f7d5893bc57680117d not found: ID does not exist" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.185624 4971 scope.go:117] "RemoveContainer" containerID="bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c" Mar 09 10:02:09 crc kubenswrapper[4971]: E0309 10:02:09.185905 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c\": container with ID starting with bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c not found: ID does not exist" containerID="bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.185956 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c"} err="failed to get container status \"bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c\": rpc error: code = NotFound desc = could not find container \"bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c\": container with ID starting with bdc0f6a029467e1971db806a7a94314315752a31c4784fb825b648024c484c8c not found: ID does not 
exist" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.234512 4971 scope.go:117] "RemoveContainer" containerID="e494e41aa24e964423bfe50acc40db831782757ecaf70f17b9cf8095a61cf1f0" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.258426 4971 scope.go:117] "RemoveContainer" containerID="387b2365a9ff29875776b85adc14987f2110e074d2204addf6cc14451029b082" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.277376 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4jd6\" (UniqueName: \"kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6\") pod \"b62d5246-a7fe-4be6-9935-732dafc959a0\" (UID: \"b62d5246-a7fe-4be6-9935-732dafc959a0\") " Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.282512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6" (OuterVolumeSpecName: "kube-api-access-x4jd6") pod "b62d5246-a7fe-4be6-9935-732dafc959a0" (UID: "b62d5246-a7fe-4be6-9935-732dafc959a0"). InnerVolumeSpecName "kube-api-access-x4jd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.285061 4971 scope.go:117] "RemoveContainer" containerID="addf5a50f8ae5778125e8c51fedd74e68fad53dc157ea2cd71b543cc15a49308" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.309736 4971 scope.go:117] "RemoveContainer" containerID="0e67b4838cd380b4b2100e2af862182488f46704ee234f96714aade59f12ccc8" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.351587 4971 scope.go:117] "RemoveContainer" containerID="9b75a598bd5760c08bfec3d215e963d264f41fcd7034641b137fb9e1250ee069" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.379536 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4jd6\" (UniqueName: \"kubernetes.io/projected/b62d5246-a7fe-4be6-9935-732dafc959a0-kube-api-access-x4jd6\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.382638 4971 scope.go:117] "RemoveContainer" containerID="981f1843c34316c5481bd60d5a3239a73b420beb85b61747c32bbd64f3b70863" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.907765 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kcgfg/must-gather-xchvf" event={"ID":"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677","Type":"ContainerStarted","Data":"eb9dbd6d716d92b782824034dcae6cac7db93f3383203e20ab5f8f75edcb1f96"} Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.908092 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kcgfg/must-gather-xchvf" event={"ID":"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677","Type":"ContainerStarted","Data":"cf0a6c15d4a3b2748b60fa01d21a47e38be63a23c582789361ef666b89cb1bfd"} Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.909742 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" 
event={"ID":"b62d5246-a7fe-4be6-9935-732dafc959a0","Type":"ContainerDied","Data":"9350316beb7aa63c36148d198fedea74b651611ea844fe84cc54a4649485e1c1"} Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.909830 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9350316beb7aa63c36148d198fedea74b651611ea844fe84cc54a4649485e1c1" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.909800 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-kwsrg" Mar 09 10:02:09 crc kubenswrapper[4971]: I0309 10:02:09.931708 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kcgfg/must-gather-xchvf" podStartSLOduration=1.519163335 podStartE2EDuration="7.931688299s" podCreationTimestamp="2026-03-09 10:02:02 +0000 UTC" firstStartedPulling="2026-03-09 10:02:02.77260864 +0000 UTC m=+2526.332536450" lastFinishedPulling="2026-03-09 10:02:09.185133604 +0000 UTC m=+2532.745061414" observedRunningTime="2026-03-09 10:02:09.92752594 +0000 UTC m=+2533.487453750" watchObservedRunningTime="2026-03-09 10:02:09.931688299 +0000 UTC m=+2533.491616109" Mar 09 10:02:10 crc kubenswrapper[4971]: I0309 10:02:10.196848 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-4gnqj"] Mar 09 10:02:10 crc kubenswrapper[4971]: I0309 10:02:10.202963 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-4gnqj"] Mar 09 10:02:11 crc kubenswrapper[4971]: I0309 10:02:11.163797 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a220b921-116d-4764-86f8-894f497476f6" path="/var/lib/kubelet/pods/a220b921-116d-4764-86f8-894f497476f6/volumes" Mar 09 10:02:14 crc kubenswrapper[4971]: I0309 10:02:14.152664 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:02:14 crc 
kubenswrapper[4971]: E0309 10:02:14.153262 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:02:27 crc kubenswrapper[4971]: I0309 10:02:27.156962 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:02:27 crc kubenswrapper[4971]: E0309 10:02:27.157702 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:02:40 crc kubenswrapper[4971]: I0309 10:02:40.151722 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:02:40 crc kubenswrapper[4971]: E0309 10:02:40.152451 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.433584 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/util/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.607666 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/pull/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.633489 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/pull/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.635310 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/util/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.818283 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/util/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.854450 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/pull/0.log" Mar 09 10:02:47 crc kubenswrapper[4971]: I0309 10:02:47.881196 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_30e5b271560c361eaa311e71b932ea1535005918a3724116e4108e6d0arwfzb_d7c7637a-be84-42d8-bb09-14af7f6acc0b/extract/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.027277 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/util/0.log" Mar 09 10:02:48 crc 
kubenswrapper[4971]: I0309 10:02:48.145093 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.151486 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/util/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.180446 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.344639 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/util/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.361674 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.361793 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5907bn9p_f7a9f6bd-2366-4ffb-95a1-14d177e046a6/extract/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.536385 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/util/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.674284 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.694706 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.757879 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/util/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.868682 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/util/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.909876 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/pull/0.log" Mar 09 10:02:48 crc kubenswrapper[4971]: I0309 10:02:48.928550 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40vh586_af0437bc-ace3-44dd-97d2-f23bee5b48f7/extract/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.049771 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/util/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.223206 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/pull/0.log" Mar 09 10:02:49 crc 
kubenswrapper[4971]: I0309 10:02:49.245409 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/util/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.289617 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/pull/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.436966 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/util/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.457973 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/pull/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.458472 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6c8jxs_29e24e00-d64d-44ac-9ea3-b6cfb014d046/extract/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.712551 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-index-p2vwl_02129c03-c7b1-4165-b737-019e757c635d/registry-server/0.log" Mar 09 10:02:49 crc kubenswrapper[4971]: I0309 10:02:49.922152 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/util/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.203328 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/util/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.231751 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/pull/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.235711 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/pull/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.554096 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/extract/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.561490 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/util/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.577593 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8c9gkgf_48c57993-da6a-45d8-8103-c90eb33399b0/pull/0.log" Mar 09 10:02:50 crc kubenswrapper[4971]: I0309 10:02:50.852574 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/util/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.091489 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/pull/0.log" Mar 09 10:02:51 crc 
kubenswrapper[4971]: I0309 10:02:51.094379 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/pull/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.103070 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/util/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.355959 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/util/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.357309 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/extract/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.539317 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd4824cf53af521817afef413be59efd51147582fa0bb18c7636ae5f656bhpt_566d73b4-920e-430e-ab8c-da58c5834dce/pull/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.703215 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-557f5c56bb-4glvw_ed9e539e-9f00-4168-9486-c1aa126c0514/manager/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.779261 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-rvqm6_5d62c895-9226-41c3-b4b3-23f6990c1ee7/registry-server/0.log" Mar 09 10:02:51 crc kubenswrapper[4971]: I0309 10:02:51.923526 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6659f69886-7494k_2fead548-d73c-4b70-8a1f-84aedf664c53/manager/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.019555 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-4m4x8_9be7483c-58ce-4857-b90f-fe74b32b3bdd/registry-server/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.131717 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5794c4499-rj58k_f0a1e70a-0ac8-4e6e-87d6-85e6097cf8e7/manager/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.151961 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:02:52 crc kubenswrapper[4971]: E0309 10:02:52.152246 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.314110 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-g8xc7_ae6f5029-30ab-4f16-bae0-38c580d4acfa/registry-server/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.409773 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-gvnmw_76fe4e94-ba21-4369-882b-efdc47c25ec3/operator/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.468797 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfb4fc94-fzpfg_732b8106-b919-410c-b481-43320eb43604/manager/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.517417 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-f85lg_1d93efdc-b806-4f44-806b-9b6b43b80b22/registry-server/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.657295 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fd95cd797-b9sk8_802c0560-39aa-4e21-a55e-6374f50e4301/manager/0.log" Mar 09 10:02:52 crc kubenswrapper[4971]: I0309 10:02:52.679064 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-jc7b2_97cd3aa2-fa2c-4950-aadc-75530bbfe9bb/registry-server/0.log" Mar 09 10:03:03 crc kubenswrapper[4971]: I0309 10:03:03.152454 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:03:03 crc kubenswrapper[4971]: E0309 10:03:03.153463 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:03:06 crc kubenswrapper[4971]: I0309 10:03:06.898426 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dvp8t_c9cdbff0-0cca-4375-8c92-1117ce5d1dea/control-plane-machine-set-operator/0.log" Mar 09 10:03:07 crc kubenswrapper[4971]: I0309 10:03:07.048508 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t9hb6_db64f07f-f1cb-4754-8e1f-33951a826f78/kube-rbac-proxy/0.log" Mar 09 10:03:07 crc kubenswrapper[4971]: I0309 10:03:07.081601 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t9hb6_db64f07f-f1cb-4754-8e1f-33951a826f78/machine-api-operator/0.log" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.634836 4971 scope.go:117] "RemoveContainer" containerID="9c7e0e86b8223b66eeeee7e68d93bfc096cc6e5e37d0a55d9172ea7846991416" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.672959 4971 scope.go:117] "RemoveContainer" containerID="76e0391295db63046de77962ba199197b6ccf7fb83aafa13c1ded24ebb4178d3" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.694405 4971 scope.go:117] "RemoveContainer" containerID="9209e709ecb9e93c272bf7c92edf9bfb58eb7683ec1e7a2aae415a62f09861f4" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.734734 4971 scope.go:117] "RemoveContainer" containerID="475793facec3a2f431770ecc9f9694cfcb75932344012d26b4b3ba8f7b779aff" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.763493 4971 scope.go:117] "RemoveContainer" containerID="9610e8eb00338ed9eaafbdffba515c31d540923887aef38c53f3a1c5b79eb694" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.803589 4971 scope.go:117] "RemoveContainer" containerID="0540622c2bcb9bcf74f1cff2a4d07ae304122babc85ef78c6a76be2b57b0ffa7" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.842517 4971 scope.go:117] "RemoveContainer" containerID="664d86477fa926710de60a78aee673ad9c9795e7460c47390a6e656aea7cf8e2" Mar 09 10:03:09 crc kubenswrapper[4971]: I0309 10:03:09.865244 4971 scope.go:117] "RemoveContainer" containerID="bf4e0b4cd830d25934f87c427bbbad31e1102c0c0ea5bc2a89a31559114f2caf" Mar 09 10:03:17 crc kubenswrapper[4971]: I0309 10:03:17.157455 4971 scope.go:117] "RemoveContainer" 
containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:03:17 crc kubenswrapper[4971]: E0309 10:03:17.158205 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:03:29 crc kubenswrapper[4971]: I0309 10:03:29.152471 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:03:29 crc kubenswrapper[4971]: E0309 10:03:29.153246 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.446999 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z4jb9_3a469eec-42c7-456a-9315-d028751496cd/controller/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.462702 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-z4jb9_3a469eec-42c7-456a-9315-d028751496cd/kube-rbac-proxy/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.616565 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-frr-files/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.809003 4971 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-frr-files/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.810954 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-reloader/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.815937 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-metrics/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.849238 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-reloader/0.log" Mar 09 10:03:34 crc kubenswrapper[4971]: I0309 10:03:34.999584 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-frr-files/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.007781 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-metrics/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.032619 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-metrics/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.067833 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-reloader/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.183838 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-metrics/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.213075 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-reloader/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.227415 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/cp-frr-files/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.258992 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/controller/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.380270 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/frr-metrics/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.423810 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/kube-rbac-proxy/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.444867 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/kube-rbac-proxy-frr/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.600881 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/reloader/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.679974 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-bvkfw_49bcc560-e687-4f99-9526-4baacbce3baa/frr-k8s-webhook-server/0.log" Mar 09 10:03:35 crc kubenswrapper[4971]: I0309 10:03:35.813510 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6758965db4-5xg8k_d00659bf-90ef-473d-b641-160aafb0e5cb/manager/0.log" Mar 09 10:03:36 crc kubenswrapper[4971]: I0309 10:03:36.041228 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78cf4d58c9-fftzx_f97f3e74-40e6-4980-a47c-e184ccb1ee4e/webhook-server/0.log" Mar 09 10:03:36 crc kubenswrapper[4971]: I0309 10:03:36.192880 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4xcv7_6280ff70-b6ef-483e-a767-9b62f92c1d4e/kube-rbac-proxy/0.log" Mar 09 10:03:36 crc kubenswrapper[4971]: I0309 10:03:36.376913 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4xcv7_6280ff70-b6ef-483e-a767-9b62f92c1d4e/speaker/0.log" Mar 09 10:03:36 crc kubenswrapper[4971]: I0309 10:03:36.885484 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cs8lp_624ce1af-f384-423e-847c-dc60c2996603/frr/0.log" Mar 09 10:03:43 crc kubenswrapper[4971]: I0309 10:03:43.153318 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:03:43 crc kubenswrapper[4971]: E0309 10:03:43.154098 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:03:51 crc kubenswrapper[4971]: I0309 10:03:51.640242 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-9984f6cdd-9rzrp_da90eb8d-a57a-4d85-978b-919c2008cd3a/barbican-api/0.log" Mar 09 10:03:51 crc kubenswrapper[4971]: I0309 10:03:51.770192 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-9984f6cdd-9rzrp_da90eb8d-a57a-4d85-978b-919c2008cd3a/barbican-api-log/0.log" Mar 09 10:03:51 crc kubenswrapper[4971]: I0309 10:03:51.828410 4971 log.go:25] "Finished 
parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-db-sync-5rgjq_1a41ed0b-fe0b-486e-844e-4f0cfa225bb8/barbican-db-sync/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.050775 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-7c5f6b4756-n7vwl_a3c571db-af37-4e76-a633-9f58b814341f/barbican-keystone-listener/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.123856 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-7c5f6b4756-n7vwl_a3c571db-af37-4e76-a633-9f58b814341f/barbican-keystone-listener-log/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.197710 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-worker-5dbbf7ff77-mmrwc_e2752865-3de3-46a0-b9d1-0ddb9422835b/barbican-worker/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.235169 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-worker-5dbbf7ff77-mmrwc_e2752865-3de3-46a0-b9d1-0ddb9422835b/barbican-worker-log/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.397020 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_keystone-cron-29550841-pnmz4_364959d2-c613-4a79-940c-3c00d24887ca/keystone-cron/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.759310 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_keystone-5b6d9bc6b9-jlndm_feed0de4-5e24-4a88-8a8c-4552940e76bb/keystone-api/0.log" Mar 09 10:03:52 crc kubenswrapper[4971]: I0309 10:03:52.763334 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_7fde2aa4-e297-4641-b450-e95ea05b5229/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.005584 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_7fde2aa4-e297-4641-b450-e95ea05b5229/galera/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.032761 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_7fde2aa4-e297-4641-b450-e95ea05b5229/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.300468 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_09cf3e3d-f27d-4258-a35a-17172dce14cf/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.468682 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_09cf3e3d-f27d-4258-a35a-17172dce14cf/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.503724 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_09cf3e3d-f27d-4258-a35a-17172dce14cf/galera/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.687985 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_0248bf28-3089-40e7-9ab1-2131010368c4/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.879320 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_0248bf28-3089-40e7-9ab1-2131010368c4/mysql-bootstrap/0.log" Mar 09 10:03:53 crc kubenswrapper[4971]: I0309 10:03:53.941168 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_0248bf28-3089-40e7-9ab1-2131010368c4/galera/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.094152 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_5fbe67b5-f371-4d9a-9777-cbfeff3f2863/setup-container/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.246234 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_5fbe67b5-f371-4d9a-9777-cbfeff3f2863/setup-container/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.312112 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_5fbe67b5-f371-4d9a-9777-cbfeff3f2863/rabbitmq/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.481289 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-4gkzk_c0f6b660-a1e1-4d7d-bff5-3b2cc666bada/proxy-httpd/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.531138 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-4gkzk_c0f6b660-a1e1-4d7d-bff5-3b2cc666bada/proxy-server/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.714170 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/account-auditor/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.737753 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/account-reaper/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.837685 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/account-replicator/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.930596 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/account-server/0.log" Mar 09 10:03:54 crc kubenswrapper[4971]: I0309 10:03:54.981401 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/container-auditor/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.039365 4971 log.go:25] "Finished parsing 
log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/container-replicator/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.072578 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_memcached-0_e4c9ed17-abec-40ab-acd0-aa857fd946f9/memcached/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.165747 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/container-server/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.198320 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/container-updater/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.204918 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/object-auditor/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.245447 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/object-expirer/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.363245 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/object-replicator/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.396744 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/object-updater/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.399309 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/object-server/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.433035 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/rsync/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.560222 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/account-auditor/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.582103 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_698abc6e-c9eb-4568-8639-8c10c5958c3c/swift-recon-cron/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.608090 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/account-reaper/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.735158 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/account-server/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.746338 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/account-replicator/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.791452 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/container-replicator/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.799052 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/container-auditor/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.919255 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/container-server/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.919556 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/container-updater/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.957574 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/object-auditor/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.977823 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/object-expirer/0.log" Mar 09 10:03:55 crc kubenswrapper[4971]: I0309 10:03:55.989204 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/object-replicator/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.078239 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/object-server/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.094009 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/object-updater/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.111168 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/rsync/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.138799 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_604e95e7-5b66-4837-ae0a-2b08c59fac4b/swift-recon-cron/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.151731 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:03:56 crc kubenswrapper[4971]: E0309 10:03:56.151944 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.249951 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/account-auditor/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.273503 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/account-reaper/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.289220 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/account-replicator/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.333557 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/account-server/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.399156 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/container-auditor/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.430984 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/container-replicator/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.460641 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/container-server/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.466767 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/container-updater/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.494769 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/object-auditor/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.578819 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/object-expirer/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.611819 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/object-replicator/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.639541 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/object-server/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.654743 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/object-updater/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.690811 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/rsync/0.log" Mar 09 10:03:56 crc kubenswrapper[4971]: I0309 10:03:56.756413 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_ae2371a4-446c-4c46-844e-0132f54ca498/swift-recon-cron/0.log" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.143723 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550844-f6wb5"] Mar 09 10:04:00 crc kubenswrapper[4971]: E0309 10:04:00.144475 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62d5246-a7fe-4be6-9935-732dafc959a0" 
containerName="oc" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144496 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62d5246-a7fe-4be6-9935-732dafc959a0" containerName="oc" Mar 09 10:04:00 crc kubenswrapper[4971]: E0309 10:04:00.144521 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144528 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4971]: E0309 10:04:00.144547 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="extract-content" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144555 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="extract-content" Mar 09 10:04:00 crc kubenswrapper[4971]: E0309 10:04:00.144567 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="extract-utilities" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144575 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="extract-utilities" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144741 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b392e6-c5b9-4bdf-93a2-9a10abd331bc" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.144771 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62d5246-a7fe-4be6-9935-732dafc959a0" containerName="oc" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.145385 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.148064 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.148903 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.149076 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.160990 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-f6wb5"] Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.274300 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzvt\" (UniqueName: \"kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt\") pod \"auto-csr-approver-29550844-f6wb5\" (UID: \"0317963a-2307-420e-a2b2-0c3df19d4959\") " pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.377451 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzvt\" (UniqueName: \"kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt\") pod \"auto-csr-approver-29550844-f6wb5\" (UID: \"0317963a-2307-420e-a2b2-0c3df19d4959\") " pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.397127 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzvt\" (UniqueName: \"kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt\") pod \"auto-csr-approver-29550844-f6wb5\" (UID: \"0317963a-2307-420e-a2b2-0c3df19d4959\") " 
pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.461549 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.923991 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-f6wb5"] Mar 09 10:04:00 crc kubenswrapper[4971]: I0309 10:04:00.955148 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" event={"ID":"0317963a-2307-420e-a2b2-0c3df19d4959","Type":"ContainerStarted","Data":"a9b0ffdff1795e23d1067a19b9cd91ce01f23620fa72cb9628c154c0e2260887"} Mar 09 10:04:02 crc kubenswrapper[4971]: I0309 10:04:02.983315 4971 generic.go:334] "Generic (PLEG): container finished" podID="0317963a-2307-420e-a2b2-0c3df19d4959" containerID="9a6087d19bbbe3332b8637aa5f386f1478a8f140d17f32f4cdf1583666df107c" exitCode=0 Mar 09 10:04:02 crc kubenswrapper[4971]: I0309 10:04:02.983380 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" event={"ID":"0317963a-2307-420e-a2b2-0c3df19d4959","Type":"ContainerDied","Data":"9a6087d19bbbe3332b8637aa5f386f1478a8f140d17f32f4cdf1583666df107c"} Mar 09 10:04:04 crc kubenswrapper[4971]: I0309 10:04:04.273336 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:04 crc kubenswrapper[4971]: I0309 10:04:04.439220 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzvt\" (UniqueName: \"kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt\") pod \"0317963a-2307-420e-a2b2-0c3df19d4959\" (UID: \"0317963a-2307-420e-a2b2-0c3df19d4959\") " Mar 09 10:04:04 crc kubenswrapper[4971]: I0309 10:04:04.445550 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt" (OuterVolumeSpecName: "kube-api-access-gzzvt") pod "0317963a-2307-420e-a2b2-0c3df19d4959" (UID: "0317963a-2307-420e-a2b2-0c3df19d4959"). InnerVolumeSpecName "kube-api-access-gzzvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:04:04 crc kubenswrapper[4971]: I0309 10:04:04.541439 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzvt\" (UniqueName: \"kubernetes.io/projected/0317963a-2307-420e-a2b2-0c3df19d4959-kube-api-access-gzzvt\") on node \"crc\" DevicePath \"\"" Mar 09 10:04:05 crc kubenswrapper[4971]: I0309 10:04:05.001260 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" event={"ID":"0317963a-2307-420e-a2b2-0c3df19d4959","Type":"ContainerDied","Data":"a9b0ffdff1795e23d1067a19b9cd91ce01f23620fa72cb9628c154c0e2260887"} Mar 09 10:04:05 crc kubenswrapper[4971]: I0309 10:04:05.001719 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b0ffdff1795e23d1067a19b9cd91ce01f23620fa72cb9628c154c0e2260887" Mar 09 10:04:05 crc kubenswrapper[4971]: I0309 10:04:05.001674 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-f6wb5" Mar 09 10:04:05 crc kubenswrapper[4971]: I0309 10:04:05.325742 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-fvgvb"] Mar 09 10:04:05 crc kubenswrapper[4971]: I0309 10:04:05.331975 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-fvgvb"] Mar 09 10:04:07 crc kubenswrapper[4971]: I0309 10:04:07.163997 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb" path="/var/lib/kubelet/pods/4ec3c03f-f1a7-4212-9f5f-0c3b79671ddb/volumes" Mar 09 10:04:08 crc kubenswrapper[4971]: I0309 10:04:08.152178 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:04:08 crc kubenswrapper[4971]: E0309 10:04:08.152740 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:04:08 crc kubenswrapper[4971]: I0309 10:04:08.857838 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-utilities/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.069510 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-content/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.102174 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-utilities/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.106792 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-content/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.268113 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-utilities/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.300141 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/extract-content/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.574170 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-utilities/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.656004 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-utilities/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.687961 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-content/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.792016 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-content/0.log" Mar 09 10:04:09 crc kubenswrapper[4971]: I0309 10:04:09.822151 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pqnjj_cef1bcb9-ac3d-4891-8308-d53d5acf90ac/registry-server/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.002400 4971 scope.go:117] "RemoveContainer" containerID="db9706a2cd00b931324fa2f277088837c2872c9b3268294838243eb2e2d7fecc" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.003530 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-content/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.008525 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/extract-utilities/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.028702 4971 scope.go:117] "RemoveContainer" containerID="25f4f5ce8ee26693d1bdd6e6c575a9ade25a381aed2bebdb78375e00d502030e" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.092836 4971 scope.go:117] "RemoveContainer" containerID="fb822c0e1473ee993de436875741d20de5a4cd0128860d42b74dbfddd7d7653a" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.138605 4971 scope.go:117] "RemoveContainer" containerID="da66792949c32da0ae0b5463bf18a328622da4220e5b44b4e0bfd9159b9a726f" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.163162 4971 scope.go:117] "RemoveContainer" containerID="919e6e55c80df5f96f166baccbc961dfd066fb667b6821478672eb0a661336ac" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.209487 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/util/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.226442 4971 scope.go:117] "RemoveContainer" containerID="e395dac390e2310108846d7a781e8be05e9ac4e9554caadc8cf57fe60e56aa19" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.251540 
4971 scope.go:117] "RemoveContainer" containerID="cda968c437520562c4f15807e00fcb0df9be2b2ec794240bacac803a73af827a" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.278484 4971 scope.go:117] "RemoveContainer" containerID="fdc681a59d04d51318a102f0fd0e3dd9f580263ce86d9ae72d11b8f85b2eb334" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.403475 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f6f5p_299ba20e-3df0-4e8d-9f7b-8e2201422c98/registry-server/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.500043 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/util/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.520864 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/pull/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.524001 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/pull/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.669104 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/util/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.675513 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/pull/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.720748 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f46q4vh_d32c7d36-749d-4cd4-a790-e1e702d6cd64/extract/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.864478 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-26997_3cda571e-d5b5-4436-8846-df239e1c4b79/marketplace-operator/0.log" Mar 09 10:04:10 crc kubenswrapper[4971]: I0309 10:04:10.925245 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.117332 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.122394 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-content/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.125247 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-content/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.292394 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.322449 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/extract-content/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.397071 4971 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-przm2_a31dcdaa-d065-40ad-b444-896e8f2524bc/registry-server/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.483519 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.700838 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.752256 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-content/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.774659 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-content/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.938834 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-utilities/0.log" Mar 09 10:04:11 crc kubenswrapper[4971]: I0309 10:04:11.988210 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/extract-content/0.log" Mar 09 10:04:12 crc kubenswrapper[4971]: I0309 10:04:12.536581 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qdtbr_136304aa-bacf-46c4-8994-bc6491555b4c/registry-server/0.log" Mar 09 10:04:19 crc kubenswrapper[4971]: I0309 10:04:19.151907 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:04:19 crc kubenswrapper[4971]: E0309 10:04:19.152390 
4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:04:33 crc kubenswrapper[4971]: I0309 10:04:33.151942 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:04:33 crc kubenswrapper[4971]: E0309 10:04:33.152705 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:04:48 crc kubenswrapper[4971]: I0309 10:04:48.152484 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:04:48 crc kubenswrapper[4971]: E0309 10:04:48.153474 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:05:02 crc kubenswrapper[4971]: I0309 10:05:02.151858 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:05:02 crc kubenswrapper[4971]: E0309 
10:05:02.152685 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.436947 4971 scope.go:117] "RemoveContainer" containerID="4732f9b6cb4b774e4532abcdfa0dace1c4b84b196710fc30c5555b299ded7e47" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.468906 4971 scope.go:117] "RemoveContainer" containerID="466c30e2b53e4edf62fe8fec55369d93336b1fa3093a5516d92bd5b0518f9b0e" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.499456 4971 scope.go:117] "RemoveContainer" containerID="3c333d82222ddd5a72bedfefd528ead9d93b394c3110378c6c63bf67ceeb4f09" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.536323 4971 scope.go:117] "RemoveContainer" containerID="ecb5af70365e99bb17d86c3bb6ae67474ac6fa2bf570bd9afe669807aad42c61" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.565763 4971 scope.go:117] "RemoveContainer" containerID="5a62677f01f116d16f099dcfd7688e175521fbaa6f39a9541196f81f5f0dc3db" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.596766 4971 scope.go:117] "RemoveContainer" containerID="a95776a42bffa19910c66efeaed1e0d4269ff8edd7efefa0d1e23aa23d83d282" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.627324 4971 scope.go:117] "RemoveContainer" containerID="7ce52775298d36f70200a2f109c51e6cebafcadeef3055e9a76cf5b65b05d0ca" Mar 09 10:05:10 crc kubenswrapper[4971]: I0309 10:05:10.648960 4971 scope.go:117] "RemoveContainer" containerID="f6b0f8bb2f6fdd17ab22e25265268bc903a3a8dd8845a64ae9fdfc7a857368e4" Mar 09 10:05:17 crc kubenswrapper[4971]: I0309 10:05:17.156851 4971 scope.go:117] "RemoveContainer" 
containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:05:17 crc kubenswrapper[4971]: E0309 10:05:17.157685 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:05:30 crc kubenswrapper[4971]: I0309 10:05:30.725740 4971 generic.go:334] "Generic (PLEG): container finished" podID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerID="cf0a6c15d4a3b2748b60fa01d21a47e38be63a23c582789361ef666b89cb1bfd" exitCode=0 Mar 09 10:05:30 crc kubenswrapper[4971]: I0309 10:05:30.725959 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kcgfg/must-gather-xchvf" event={"ID":"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677","Type":"ContainerDied","Data":"cf0a6c15d4a3b2748b60fa01d21a47e38be63a23c582789361ef666b89cb1bfd"} Mar 09 10:05:30 crc kubenswrapper[4971]: I0309 10:05:30.726625 4971 scope.go:117] "RemoveContainer" containerID="cf0a6c15d4a3b2748b60fa01d21a47e38be63a23c582789361ef666b89cb1bfd" Mar 09 10:05:30 crc kubenswrapper[4971]: I0309 10:05:30.814515 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kcgfg_must-gather-xchvf_30f1f59d-f892-41c9-bbcf-f1a1f8fd9677/gather/0.log" Mar 09 10:05:32 crc kubenswrapper[4971]: I0309 10:05:32.152200 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:05:32 crc kubenswrapper[4971]: E0309 10:05:32.152457 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.617233 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kcgfg/must-gather-xchvf"] Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.618080 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kcgfg/must-gather-xchvf" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="copy" containerID="cri-o://eb9dbd6d716d92b782824034dcae6cac7db93f3383203e20ab5f8f75edcb1f96" gracePeriod=2 Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.625046 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kcgfg/must-gather-xchvf"] Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.784595 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kcgfg_must-gather-xchvf_30f1f59d-f892-41c9-bbcf-f1a1f8fd9677/copy/0.log" Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.784956 4971 generic.go:334] "Generic (PLEG): container finished" podID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerID="eb9dbd6d716d92b782824034dcae6cac7db93f3383203e20ab5f8f75edcb1f96" exitCode=143 Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.985564 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kcgfg_must-gather-xchvf_30f1f59d-f892-41c9-bbcf-f1a1f8fd9677/copy/0.log" Mar 09 10:05:37 crc kubenswrapper[4971]: I0309 10:05:37.986391 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.135646 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxs6k\" (UniqueName: \"kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k\") pod \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.135839 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output\") pod \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\" (UID: \"30f1f59d-f892-41c9-bbcf-f1a1f8fd9677\") " Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.141500 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k" (OuterVolumeSpecName: "kube-api-access-gxs6k") pod "30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" (UID: "30f1f59d-f892-41c9-bbcf-f1a1f8fd9677"). InnerVolumeSpecName "kube-api-access-gxs6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.237455 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxs6k\" (UniqueName: \"kubernetes.io/projected/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-kube-api-access-gxs6k\") on node \"crc\" DevicePath \"\"" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.240798 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" (UID: "30f1f59d-f892-41c9-bbcf-f1a1f8fd9677"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.338524 4971 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.793659 4971 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kcgfg_must-gather-xchvf_30f1f59d-f892-41c9-bbcf-f1a1f8fd9677/copy/0.log" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.795428 4971 scope.go:117] "RemoveContainer" containerID="eb9dbd6d716d92b782824034dcae6cac7db93f3383203e20ab5f8f75edcb1f96" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.795459 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kcgfg/must-gather-xchvf" Mar 09 10:05:38 crc kubenswrapper[4971]: I0309 10:05:38.815316 4971 scope.go:117] "RemoveContainer" containerID="cf0a6c15d4a3b2748b60fa01d21a47e38be63a23c582789361ef666b89cb1bfd" Mar 09 10:05:39 crc kubenswrapper[4971]: I0309 10:05:39.161624 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" path="/var/lib/kubelet/pods/30f1f59d-f892-41c9-bbcf-f1a1f8fd9677/volumes" Mar 09 10:05:45 crc kubenswrapper[4971]: I0309 10:05:45.153368 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:05:45 crc kubenswrapper[4971]: E0309 10:05:45.155729 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" 
podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.141990 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550846-g64zg"] Mar 09 10:06:00 crc kubenswrapper[4971]: E0309 10:06:00.142977 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="copy" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.142994 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="copy" Mar 09 10:06:00 crc kubenswrapper[4971]: E0309 10:06:00.143024 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0317963a-2307-420e-a2b2-0c3df19d4959" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143032 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="0317963a-2307-420e-a2b2-0c3df19d4959" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4971]: E0309 10:06:00.143047 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="gather" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143055 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="gather" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143248 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="copy" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143271 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="0317963a-2307-420e-a2b2-0c3df19d4959" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143306 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f1f59d-f892-41c9-bbcf-f1a1f8fd9677" containerName="gather" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.143936 4971 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.146300 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.146761 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.149920 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.152327 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:06:00 crc kubenswrapper[4971]: E0309 10:06:00.152566 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.153785 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-g64zg"] Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.271956 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxw8\" (UniqueName: \"kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8\") pod \"auto-csr-approver-29550846-g64zg\" (UID: \"b3b47177-d617-4e44-81d6-efc7484b993b\") " pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.375287 4971 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxw8\" (UniqueName: \"kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8\") pod \"auto-csr-approver-29550846-g64zg\" (UID: \"b3b47177-d617-4e44-81d6-efc7484b993b\") " pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.408992 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxw8\" (UniqueName: \"kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8\") pod \"auto-csr-approver-29550846-g64zg\" (UID: \"b3b47177-d617-4e44-81d6-efc7484b993b\") " pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.465284 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.916834 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-g64zg"] Mar 09 10:06:00 crc kubenswrapper[4971]: I0309 10:06:00.972795 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-g64zg" event={"ID":"b3b47177-d617-4e44-81d6-efc7484b993b","Type":"ContainerStarted","Data":"f060ebd55cc93b8bf41e7bf650826bc71dfe49960ac9e748a68ae35b8421df30"} Mar 09 10:06:02 crc kubenswrapper[4971]: I0309 10:06:02.993215 4971 generic.go:334] "Generic (PLEG): container finished" podID="b3b47177-d617-4e44-81d6-efc7484b993b" containerID="a37c8f4a685f670313ef4f75c721130553c9301eb520266a001dc4aaeccc551a" exitCode=0 Mar 09 10:06:02 crc kubenswrapper[4971]: I0309 10:06:02.993417 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-g64zg" 
event={"ID":"b3b47177-d617-4e44-81d6-efc7484b993b","Type":"ContainerDied","Data":"a37c8f4a685f670313ef4f75c721130553c9301eb520266a001dc4aaeccc551a"} Mar 09 10:06:04 crc kubenswrapper[4971]: I0309 10:06:04.282785 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:04 crc kubenswrapper[4971]: I0309 10:06:04.439134 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxw8\" (UniqueName: \"kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8\") pod \"b3b47177-d617-4e44-81d6-efc7484b993b\" (UID: \"b3b47177-d617-4e44-81d6-efc7484b993b\") " Mar 09 10:06:04 crc kubenswrapper[4971]: I0309 10:06:04.446020 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8" (OuterVolumeSpecName: "kube-api-access-rqxw8") pod "b3b47177-d617-4e44-81d6-efc7484b993b" (UID: "b3b47177-d617-4e44-81d6-efc7484b993b"). InnerVolumeSpecName "kube-api-access-rqxw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:06:04 crc kubenswrapper[4971]: I0309 10:06:04.541431 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxw8\" (UniqueName: \"kubernetes.io/projected/b3b47177-d617-4e44-81d6-efc7484b993b-kube-api-access-rqxw8\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:05 crc kubenswrapper[4971]: I0309 10:06:05.008798 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-g64zg" event={"ID":"b3b47177-d617-4e44-81d6-efc7484b993b","Type":"ContainerDied","Data":"f060ebd55cc93b8bf41e7bf650826bc71dfe49960ac9e748a68ae35b8421df30"} Mar 09 10:06:05 crc kubenswrapper[4971]: I0309 10:06:05.008836 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f060ebd55cc93b8bf41e7bf650826bc71dfe49960ac9e748a68ae35b8421df30" Mar 09 10:06:05 crc kubenswrapper[4971]: I0309 10:06:05.008876 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-g64zg" Mar 09 10:06:05 crc kubenswrapper[4971]: I0309 10:06:05.345522 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-td6gm"] Mar 09 10:06:05 crc kubenswrapper[4971]: I0309 10:06:05.351586 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-td6gm"] Mar 09 10:06:07 crc kubenswrapper[4971]: I0309 10:06:07.166296 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb66c60e-a138-4804-9aaf-db389174e600" path="/var/lib/kubelet/pods/eb66c60e-a138-4804-9aaf-db389174e600/volumes" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.787853 4971 scope.go:117] "RemoveContainer" containerID="c4ecc5a1f2d36b3a9c1f8a6b86b36611cc38f88183b958b2dd145d47eb8f469f" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.816072 4971 scope.go:117] "RemoveContainer" 
containerID="b41063b93081525248eaaf64bb95b955d94452c3ca2c31d4954f1b31cba02039" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.858136 4971 scope.go:117] "RemoveContainer" containerID="73a86e24c36b93d5ca44dc44599927bcf62f23e1605c0315854cbf4e70734cb6" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.893611 4971 scope.go:117] "RemoveContainer" containerID="50f20ce0bf95f462480f5417e1377c40128cdb475c8b05fba7fd110117ed1896" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.928938 4971 scope.go:117] "RemoveContainer" containerID="d52b25a01b3bc92c585e33660df738331f5a5cd1546de767cbaf368a82f8761a" Mar 09 10:06:10 crc kubenswrapper[4971]: I0309 10:06:10.957659 4971 scope.go:117] "RemoveContainer" containerID="68f513651164bac4e873985650e6dd7c50049c37cb341a47f3d6f1edf2bd97a4" Mar 09 10:06:11 crc kubenswrapper[4971]: I0309 10:06:11.001000 4971 scope.go:117] "RemoveContainer" containerID="61b867e30f8fe175a116c548a92649ea336dc608ac8f50edb676419f68ea2343" Mar 09 10:06:11 crc kubenswrapper[4971]: I0309 10:06:11.034942 4971 scope.go:117] "RemoveContainer" containerID="19a3eec3364417e11fa09e5352c12f11703d798d0c7cef43bb83339bf7456876" Mar 09 10:06:12 crc kubenswrapper[4971]: I0309 10:06:12.151938 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:06:12 crc kubenswrapper[4971]: E0309 10:06:12.152131 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:06:27 crc kubenswrapper[4971]: I0309 10:06:27.156332 4971 scope.go:117] "RemoveContainer" 
containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:06:27 crc kubenswrapper[4971]: E0309 10:06:27.157097 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:06:41 crc kubenswrapper[4971]: I0309 10:06:41.151734 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:06:41 crc kubenswrapper[4971]: E0309 10:06:41.152607 4971 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p56wx_openshift-machine-config-operator(05fde3ad-1182-4b15-bb1a-f365ecc92d75)\"" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" Mar 09 10:06:53 crc kubenswrapper[4971]: I0309 10:06:53.151937 4971 scope.go:117] "RemoveContainer" containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819" Mar 09 10:06:53 crc kubenswrapper[4971]: I0309 10:06:53.796426 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"5e4262b6156b737db456cbad7dac00cd099bb509fa826739e48420e5a2b2118b"} Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.219100 4971 scope.go:117] "RemoveContainer" containerID="cd636d6025b257e9f53085f75309c5876201edfae28121ba7a1624559675e6a4" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 
10:07:11.257497 4971 scope.go:117] "RemoveContainer" containerID="a8b7cce18f02193cca732d9769dc6f3e19f411a0b5ad3b777e795ebba71b99e7" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.305943 4971 scope.go:117] "RemoveContainer" containerID="482a4801c0c1b311da565025675e1a6aaa1ff8dbfd62aa7bd77d2c675b688b64" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.343761 4971 scope.go:117] "RemoveContainer" containerID="293423da4e0a9a0ae438579ef410d5337f4b42f07585d8bd88b623446084b8f1" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.367917 4971 scope.go:117] "RemoveContainer" containerID="0b5f58db7841898efe0f54a28ba295479a11453ff2ee34fb515fcc92b8fe7f5e" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.398088 4971 scope.go:117] "RemoveContainer" containerID="e0c8f2818a23065647eea7145c2d2ebeac0f8d490c56efca08f9a76471e68c7d" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.439753 4971 scope.go:117] "RemoveContainer" containerID="d93110eeddc1858a96ea5f8cbc6ca309fbf6b963d629c9ffde23ec8cdece4f04" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.469856 4971 scope.go:117] "RemoveContainer" containerID="3a752b5a987227c9d24eae5351d193caecb4de2ac42d931bca06f7cd140065f7" Mar 09 10:07:11 crc kubenswrapper[4971]: I0309 10:07:11.496930 4971 scope.go:117] "RemoveContainer" containerID="f50802e17a85d376c9d2e653d714e35101a371e2bba637b26a23a2b18e76a606" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.597173 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:07:54 crc kubenswrapper[4971]: E0309 10:07:54.598173 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b47177-d617-4e44-81d6-efc7484b993b" containerName="oc" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.598192 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b47177-d617-4e44-81d6-efc7484b993b" containerName="oc" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.598403 4971 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b47177-d617-4e44-81d6-efc7484b993b" containerName="oc" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.599868 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.604896 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.791443 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.791569 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slgjt\" (UniqueName: \"kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.791604 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.892731 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities\") pod 
\"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.892820 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slgjt\" (UniqueName: \"kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.892844 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.893232 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.893282 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content\") pod \"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:54 crc kubenswrapper[4971]: I0309 10:07:54.917472 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slgjt\" (UniqueName: \"kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt\") pod 
\"certified-operators-dtlmk\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:55 crc kubenswrapper[4971]: I0309 10:07:55.217669 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:07:55 crc kubenswrapper[4971]: I0309 10:07:55.666547 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:07:56 crc kubenswrapper[4971]: I0309 10:07:56.384623 4971 generic.go:334] "Generic (PLEG): container finished" podID="36353c08-70c5-4011-8143-9dff8d33e099" containerID="9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb" exitCode=0 Mar 09 10:07:56 crc kubenswrapper[4971]: I0309 10:07:56.384689 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerDied","Data":"9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb"} Mar 09 10:07:56 crc kubenswrapper[4971]: I0309 10:07:56.384859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerStarted","Data":"3b87105d5e56136a5ecf7c82be674999435e99996279f3e842abbd491a209708"} Mar 09 10:07:56 crc kubenswrapper[4971]: I0309 10:07:56.386679 4971 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:07:57 crc kubenswrapper[4971]: I0309 10:07:57.392853 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerStarted","Data":"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509"} Mar 09 10:07:58 crc kubenswrapper[4971]: I0309 10:07:58.402868 4971 generic.go:334] "Generic (PLEG): container 
finished" podID="36353c08-70c5-4011-8143-9dff8d33e099" containerID="ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509" exitCode=0 Mar 09 10:07:58 crc kubenswrapper[4971]: I0309 10:07:58.402914 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerDied","Data":"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509"} Mar 09 10:07:59 crc kubenswrapper[4971]: I0309 10:07:59.426758 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerStarted","Data":"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff"} Mar 09 10:07:59 crc kubenswrapper[4971]: I0309 10:07:59.453044 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtlmk" podStartSLOduration=3.002716124 podStartE2EDuration="5.453028274s" podCreationTimestamp="2026-03-09 10:07:54 +0000 UTC" firstStartedPulling="2026-03-09 10:07:56.386469195 +0000 UTC m=+2879.946397005" lastFinishedPulling="2026-03-09 10:07:58.836781335 +0000 UTC m=+2882.396709155" observedRunningTime="2026-03-09 10:07:59.449263316 +0000 UTC m=+2883.009191126" watchObservedRunningTime="2026-03-09 10:07:59.453028274 +0000 UTC m=+2883.012956094" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.142242 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550848-7qdzb"] Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.143669 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.145631 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.146438 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.147894 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.152289 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-7qdzb"] Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.172718 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw2dj\" (UniqueName: \"kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj\") pod \"auto-csr-approver-29550848-7qdzb\" (UID: \"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13\") " pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.274017 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw2dj\" (UniqueName: \"kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj\") pod \"auto-csr-approver-29550848-7qdzb\" (UID: \"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13\") " pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.291390 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw2dj\" (UniqueName: \"kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj\") pod \"auto-csr-approver-29550848-7qdzb\" (UID: \"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13\") " 
pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.463196 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:00 crc kubenswrapper[4971]: I0309 10:08:00.899683 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-7qdzb"] Mar 09 10:08:01 crc kubenswrapper[4971]: I0309 10:08:01.443434 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" event={"ID":"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13","Type":"ContainerStarted","Data":"e594770c99ce48e9fb475b24aa696884c67feb43e12a6eaeec863c27176a2bae"} Mar 09 10:08:02 crc kubenswrapper[4971]: I0309 10:08:02.452142 4971 generic.go:334] "Generic (PLEG): container finished" podID="cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13" containerID="18b3858c3f8f1ffa302575d039cad5d36a8782056c41cb21d8f689af65a1ce24" exitCode=0 Mar 09 10:08:02 crc kubenswrapper[4971]: I0309 10:08:02.452223 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" event={"ID":"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13","Type":"ContainerDied","Data":"18b3858c3f8f1ffa302575d039cad5d36a8782056c41cb21d8f689af65a1ce24"} Mar 09 10:08:03 crc kubenswrapper[4971]: I0309 10:08:03.722696 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:03 crc kubenswrapper[4971]: I0309 10:08:03.924884 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw2dj\" (UniqueName: \"kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj\") pod \"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13\" (UID: \"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13\") " Mar 09 10:08:03 crc kubenswrapper[4971]: I0309 10:08:03.929591 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj" (OuterVolumeSpecName: "kube-api-access-vw2dj") pod "cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13" (UID: "cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13"). InnerVolumeSpecName "kube-api-access-vw2dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.026752 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw2dj\" (UniqueName: \"kubernetes.io/projected/cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13-kube-api-access-vw2dj\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.474097 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" event={"ID":"cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13","Type":"ContainerDied","Data":"e594770c99ce48e9fb475b24aa696884c67feb43e12a6eaeec863c27176a2bae"} Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.474146 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e594770c99ce48e9fb475b24aa696884c67feb43e12a6eaeec863c27176a2bae" Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.474161 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-7qdzb" Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.787794 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-kwsrg"] Mar 09 10:08:04 crc kubenswrapper[4971]: I0309 10:08:04.792536 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-kwsrg"] Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.162697 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62d5246-a7fe-4be6-9935-732dafc959a0" path="/var/lib/kubelet/pods/b62d5246-a7fe-4be6-9935-732dafc959a0/volumes" Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.218323 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.218395 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.261832 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.546095 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:05 crc kubenswrapper[4971]: I0309 10:08:05.599281 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:08:07 crc kubenswrapper[4971]: I0309 10:08:07.500546 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtlmk" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="registry-server" containerID="cri-o://b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff" gracePeriod=2 Mar 09 10:08:07 
crc kubenswrapper[4971]: I0309 10:08:07.953060 4971 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.086642 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slgjt\" (UniqueName: \"kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt\") pod \"36353c08-70c5-4011-8143-9dff8d33e099\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.086714 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content\") pod \"36353c08-70c5-4011-8143-9dff8d33e099\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.086774 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities\") pod \"36353c08-70c5-4011-8143-9dff8d33e099\" (UID: \"36353c08-70c5-4011-8143-9dff8d33e099\") " Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.092340 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities" (OuterVolumeSpecName: "utilities") pod "36353c08-70c5-4011-8143-9dff8d33e099" (UID: "36353c08-70c5-4011-8143-9dff8d33e099"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.096225 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt" (OuterVolumeSpecName: "kube-api-access-slgjt") pod "36353c08-70c5-4011-8143-9dff8d33e099" (UID: "36353c08-70c5-4011-8143-9dff8d33e099"). InnerVolumeSpecName "kube-api-access-slgjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.160544 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36353c08-70c5-4011-8143-9dff8d33e099" (UID: "36353c08-70c5-4011-8143-9dff8d33e099"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.190131 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slgjt\" (UniqueName: \"kubernetes.io/projected/36353c08-70c5-4011-8143-9dff8d33e099-kube-api-access-slgjt\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.190194 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.190219 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36353c08-70c5-4011-8143-9dff8d33e099-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.513699 4971 generic.go:334] "Generic (PLEG): container finished" podID="36353c08-70c5-4011-8143-9dff8d33e099" 
containerID="b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff" exitCode=0 Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.513748 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerDied","Data":"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff"} Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.513785 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtlmk" event={"ID":"36353c08-70c5-4011-8143-9dff8d33e099","Type":"ContainerDied","Data":"3b87105d5e56136a5ecf7c82be674999435e99996279f3e842abbd491a209708"} Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.513805 4971 scope.go:117] "RemoveContainer" containerID="b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.513801 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtlmk" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.535972 4971 scope.go:117] "RemoveContainer" containerID="ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.548134 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.562720 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtlmk"] Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.570434 4971 scope.go:117] "RemoveContainer" containerID="9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.591419 4971 scope.go:117] "RemoveContainer" containerID="b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff" Mar 09 10:08:08 crc kubenswrapper[4971]: E0309 10:08:08.592409 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff\": container with ID starting with b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff not found: ID does not exist" containerID="b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.592471 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff"} err="failed to get container status \"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff\": rpc error: code = NotFound desc = could not find container \"b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff\": container with ID starting with b3a69a65f7236d0bb4260acedb05eb8e226bac12bab221c64f8edbf9760d3aff not 
found: ID does not exist" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.592509 4971 scope.go:117] "RemoveContainer" containerID="ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509" Mar 09 10:08:08 crc kubenswrapper[4971]: E0309 10:08:08.593001 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509\": container with ID starting with ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509 not found: ID does not exist" containerID="ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.593042 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509"} err="failed to get container status \"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509\": rpc error: code = NotFound desc = could not find container \"ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509\": container with ID starting with ba052e604666d39655492a1c785e002f4c3c448472a3f392a3153a57b870e509 not found: ID does not exist" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.593069 4971 scope.go:117] "RemoveContainer" containerID="9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb" Mar 09 10:08:08 crc kubenswrapper[4971]: E0309 10:08:08.593434 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb\": container with ID starting with 9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb not found: ID does not exist" containerID="9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb" Mar 09 10:08:08 crc kubenswrapper[4971]: I0309 10:08:08.593455 4971 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb"} err="failed to get container status \"9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb\": rpc error: code = NotFound desc = could not find container \"9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb\": container with ID starting with 9ddc060b135e7164a2ce987060bd29cf50578199c14dccf80a88965b68ad80eb not found: ID does not exist" Mar 09 10:08:09 crc kubenswrapper[4971]: I0309 10:08:09.163096 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36353c08-70c5-4011-8143-9dff8d33e099" path="/var/lib/kubelet/pods/36353c08-70c5-4011-8143-9dff8d33e099/volumes" Mar 09 10:08:11 crc kubenswrapper[4971]: I0309 10:08:11.650486 4971 scope.go:117] "RemoveContainer" containerID="d60e3d02ec86c24df0b1021efac9bdd6358fe7ca123da1f824e62aaa57122002" Mar 09 10:08:11 crc kubenswrapper[4971]: I0309 10:08:11.693931 4971 scope.go:117] "RemoveContainer" containerID="39a3b1b3cc4e2473b788319e2286c8add3dd9867083cf796351ec71ee81267bc" Mar 09 10:08:11 crc kubenswrapper[4971]: I0309 10:08:11.731933 4971 scope.go:117] "RemoveContainer" containerID="84a0b990a76a3e599c0ae045b32188cf72edcc478ba44d1eaf4590ae2b360387" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.281659 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:11 crc kubenswrapper[4971]: E0309 10:09:11.282766 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="extract-utilities" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.282789 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="extract-utilities" Mar 09 10:09:11 crc kubenswrapper[4971]: E0309 10:09:11.282817 4971 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="extract-content" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.282828 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="extract-content" Mar 09 10:09:11 crc kubenswrapper[4971]: E0309 10:09:11.282849 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="registry-server" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.282860 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="registry-server" Mar 09 10:09:11 crc kubenswrapper[4971]: E0309 10:09:11.282883 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13" containerName="oc" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.282894 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13" containerName="oc" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.283165 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="36353c08-70c5-4011-8143-9dff8d33e099" containerName="registry-server" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.283210 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd0a9c4-fc2e-48b1-b85d-5372dcea3a13" containerName="oc" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.284989 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.296020 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.409164 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.409231 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bzl\" (UniqueName: \"kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.409307 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.511146 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.511224 4971 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t6bzl\" (UniqueName: \"kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.511260 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.511672 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.511891 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.531306 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bzl\" (UniqueName: \"kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl\") pod \"community-operators-kdpld\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.610161 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:11 crc kubenswrapper[4971]: I0309 10:09:11.903680 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:12 crc kubenswrapper[4971]: I0309 10:09:12.060241 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerStarted","Data":"5118e56cb84a542dec4b90105857009496f2ed9b01faaab0c6134374cba9d20e"} Mar 09 10:09:13 crc kubenswrapper[4971]: I0309 10:09:13.069067 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerID="4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31" exitCode=0 Mar 09 10:09:13 crc kubenswrapper[4971]: I0309 10:09:13.069139 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerDied","Data":"4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31"} Mar 09 10:09:14 crc kubenswrapper[4971]: I0309 10:09:14.079031 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerStarted","Data":"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965"} Mar 09 10:09:14 crc kubenswrapper[4971]: I0309 10:09:14.795421 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:09:14 crc kubenswrapper[4971]: I0309 10:09:14.795495 4971 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:09:15 crc kubenswrapper[4971]: I0309 10:09:15.089016 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerID="88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965" exitCode=0 Mar 09 10:09:15 crc kubenswrapper[4971]: I0309 10:09:15.089074 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerDied","Data":"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965"} Mar 09 10:09:16 crc kubenswrapper[4971]: I0309 10:09:16.098326 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerStarted","Data":"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016"} Mar 09 10:09:16 crc kubenswrapper[4971]: I0309 10:09:16.128177 4971 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kdpld" podStartSLOduration=2.715707252 podStartE2EDuration="5.128159641s" podCreationTimestamp="2026-03-09 10:09:11 +0000 UTC" firstStartedPulling="2026-03-09 10:09:13.071065122 +0000 UTC m=+2956.630992932" lastFinishedPulling="2026-03-09 10:09:15.483517511 +0000 UTC m=+2959.043445321" observedRunningTime="2026-03-09 10:09:16.122063567 +0000 UTC m=+2959.681991377" watchObservedRunningTime="2026-03-09 10:09:16.128159641 +0000 UTC m=+2959.688087451" Mar 09 10:09:21 crc kubenswrapper[4971]: I0309 10:09:21.611332 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kdpld" Mar 
09 10:09:21 crc kubenswrapper[4971]: I0309 10:09:21.613981 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:21 crc kubenswrapper[4971]: I0309 10:09:21.665135 4971 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:22 crc kubenswrapper[4971]: I0309 10:09:22.203816 4971 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:22 crc kubenswrapper[4971]: I0309 10:09:22.257821 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.166030 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kdpld" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="registry-server" containerID="cri-o://3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016" gracePeriod=2 Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.609878 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.707060 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content\") pod \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.707152 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities\") pod \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.707218 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6bzl\" (UniqueName: \"kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl\") pod \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\" (UID: \"d7b955e2-9fbc-43bc-a5bb-cb63059583a1\") " Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.708512 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities" (OuterVolumeSpecName: "utilities") pod "d7b955e2-9fbc-43bc-a5bb-cb63059583a1" (UID: "d7b955e2-9fbc-43bc-a5bb-cb63059583a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.713243 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl" (OuterVolumeSpecName: "kube-api-access-t6bzl") pod "d7b955e2-9fbc-43bc-a5bb-cb63059583a1" (UID: "d7b955e2-9fbc-43bc-a5bb-cb63059583a1"). InnerVolumeSpecName "kube-api-access-t6bzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.767665 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7b955e2-9fbc-43bc-a5bb-cb63059583a1" (UID: "d7b955e2-9fbc-43bc-a5bb-cb63059583a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.808758 4971 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.809043 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6bzl\" (UniqueName: \"kubernetes.io/projected/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-kube-api-access-t6bzl\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:24 crc kubenswrapper[4971]: I0309 10:09:24.809054 4971 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b955e2-9fbc-43bc-a5bb-cb63059583a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.186769 4971 generic.go:334] "Generic (PLEG): container finished" podID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerID="3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016" exitCode=0 Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.186820 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerDied","Data":"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016"} Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.186823 4971 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-kdpld" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.186872 4971 scope.go:117] "RemoveContainer" containerID="3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.186859 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kdpld" event={"ID":"d7b955e2-9fbc-43bc-a5bb-cb63059583a1","Type":"ContainerDied","Data":"5118e56cb84a542dec4b90105857009496f2ed9b01faaab0c6134374cba9d20e"} Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.210292 4971 scope.go:117] "RemoveContainer" containerID="88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.226141 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.234707 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kdpld"] Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.244005 4971 scope.go:117] "RemoveContainer" containerID="4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.278497 4971 scope.go:117] "RemoveContainer" containerID="3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016" Mar 09 10:09:25 crc kubenswrapper[4971]: E0309 10:09:25.278849 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016\": container with ID starting with 3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016 not found: ID does not exist" containerID="3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.278890 
4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016"} err="failed to get container status \"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016\": rpc error: code = NotFound desc = could not find container \"3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016\": container with ID starting with 3fc63406f1463dca9b1210fcec00da8654e6b08e89b45d14c7f699f12436a016 not found: ID does not exist" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.278915 4971 scope.go:117] "RemoveContainer" containerID="88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965" Mar 09 10:09:25 crc kubenswrapper[4971]: E0309 10:09:25.279127 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965\": container with ID starting with 88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965 not found: ID does not exist" containerID="88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.279155 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965"} err="failed to get container status \"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965\": rpc error: code = NotFound desc = could not find container \"88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965\": container with ID starting with 88f91fc498579a77b422df15fb336ea8d6d014c9eb15021f44df732d543a9965 not found: ID does not exist" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.279174 4971 scope.go:117] "RemoveContainer" containerID="4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31" Mar 09 10:09:25 crc kubenswrapper[4971]: E0309 
10:09:25.279472 4971 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31\": container with ID starting with 4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31 not found: ID does not exist" containerID="4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31" Mar 09 10:09:25 crc kubenswrapper[4971]: I0309 10:09:25.279497 4971 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31"} err="failed to get container status \"4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31\": rpc error: code = NotFound desc = could not find container \"4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31\": container with ID starting with 4b3193624b7e5bcd6d6160ba86720af6baffd083ec9fe6c3afceddca3ebf9c31 not found: ID does not exist" Mar 09 10:09:27 crc kubenswrapper[4971]: I0309 10:09:27.167741 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" path="/var/lib/kubelet/pods/d7b955e2-9fbc-43bc-a5bb-cb63059583a1/volumes" Mar 09 10:09:44 crc kubenswrapper[4971]: I0309 10:09:44.794414 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:09:44 crc kubenswrapper[4971]: I0309 10:09:44.795131 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.143524 4971 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550850-fkxpt"] Mar 09 10:10:00 crc kubenswrapper[4971]: E0309 10:10:00.144631 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="extract-content" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.144651 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="extract-content" Mar 09 10:10:00 crc kubenswrapper[4971]: E0309 10:10:00.144668 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="extract-utilities" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.144680 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="extract-utilities" Mar 09 10:10:00 crc kubenswrapper[4971]: E0309 10:10:00.144712 4971 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.144725 4971 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.144992 4971 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b955e2-9fbc-43bc-a5bb-cb63059583a1" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.145730 4971 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.148748 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.148940 4971 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-xhrv2" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.152940 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-fkxpt"] Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.153689 4971 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.251955 4971 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46dz\" (UniqueName: \"kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz\") pod \"auto-csr-approver-29550850-fkxpt\" (UID: \"f086c7e3-2611-408e-b386-5bc73f95e1d9\") " pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.353266 4971 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c46dz\" (UniqueName: \"kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz\") pod \"auto-csr-approver-29550850-fkxpt\" (UID: \"f086c7e3-2611-408e-b386-5bc73f95e1d9\") " pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.370465 4971 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46dz\" (UniqueName: \"kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz\") pod \"auto-csr-approver-29550850-fkxpt\" (UID: \"f086c7e3-2611-408e-b386-5bc73f95e1d9\") " 
pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.463761 4971 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:00 crc kubenswrapper[4971]: I0309 10:10:00.904624 4971 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-fkxpt"] Mar 09 10:10:01 crc kubenswrapper[4971]: I0309 10:10:01.657398 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" event={"ID":"f086c7e3-2611-408e-b386-5bc73f95e1d9","Type":"ContainerStarted","Data":"876841ce8e84aa810b9eabb04bde2291d11db5820d019c12bdf8259fb1356714"} Mar 09 10:10:02 crc kubenswrapper[4971]: I0309 10:10:02.665744 4971 generic.go:334] "Generic (PLEG): container finished" podID="f086c7e3-2611-408e-b386-5bc73f95e1d9" containerID="2668173ae3edde06b16c388ec636b78e11f09bfa70db868f693258c866bd4cff" exitCode=0 Mar 09 10:10:02 crc kubenswrapper[4971]: I0309 10:10:02.665991 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" event={"ID":"f086c7e3-2611-408e-b386-5bc73f95e1d9","Type":"ContainerDied","Data":"2668173ae3edde06b16c388ec636b78e11f09bfa70db868f693258c866bd4cff"} Mar 09 10:10:03 crc kubenswrapper[4971]: I0309 10:10:03.917365 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.007912 4971 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c46dz\" (UniqueName: \"kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz\") pod \"f086c7e3-2611-408e-b386-5bc73f95e1d9\" (UID: \"f086c7e3-2611-408e-b386-5bc73f95e1d9\") " Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.015473 4971 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz" (OuterVolumeSpecName: "kube-api-access-c46dz") pod "f086c7e3-2611-408e-b386-5bc73f95e1d9" (UID: "f086c7e3-2611-408e-b386-5bc73f95e1d9"). InnerVolumeSpecName "kube-api-access-c46dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.110225 4971 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c46dz\" (UniqueName: \"kubernetes.io/projected/f086c7e3-2611-408e-b386-5bc73f95e1d9-kube-api-access-c46dz\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.682258 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" event={"ID":"f086c7e3-2611-408e-b386-5bc73f95e1d9","Type":"ContainerDied","Data":"876841ce8e84aa810b9eabb04bde2291d11db5820d019c12bdf8259fb1356714"} Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.682290 4971 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-fkxpt" Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.682305 4971 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876841ce8e84aa810b9eabb04bde2291d11db5820d019c12bdf8259fb1356714" Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.980719 4971 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-f6wb5"] Mar 09 10:10:04 crc kubenswrapper[4971]: I0309 10:10:04.987948 4971 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-f6wb5"] Mar 09 10:10:05 crc kubenswrapper[4971]: I0309 10:10:05.162021 4971 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0317963a-2307-420e-a2b2-0c3df19d4959" path="/var/lib/kubelet/pods/0317963a-2307-420e-a2b2-0c3df19d4959/volumes" Mar 09 10:10:11 crc kubenswrapper[4971]: I0309 10:10:11.842754 4971 scope.go:117] "RemoveContainer" containerID="9a6087d19bbbe3332b8637aa5f386f1478a8f140d17f32f4cdf1583666df107c" Mar 09 10:10:14 crc kubenswrapper[4971]: I0309 10:10:14.795467 4971 patch_prober.go:28] interesting pod/machine-config-daemon-p56wx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:10:14 crc kubenswrapper[4971]: I0309 10:10:14.795864 4971 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:10:14 crc kubenswrapper[4971]: I0309 10:10:14.795928 4971 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-p56wx" Mar 09 10:10:14 crc kubenswrapper[4971]: I0309 10:10:14.796741 4971 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e4262b6156b737db456cbad7dac00cd099bb509fa826739e48420e5a2b2118b"} pod="openshift-machine-config-operator/machine-config-daemon-p56wx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:10:14 crc kubenswrapper[4971]: I0309 10:10:14.796842 4971 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" podUID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerName="machine-config-daemon" containerID="cri-o://5e4262b6156b737db456cbad7dac00cd099bb509fa826739e48420e5a2b2118b" gracePeriod=600 Mar 09 10:10:15 crc kubenswrapper[4971]: I0309 10:10:15.767888 4971 generic.go:334] "Generic (PLEG): container finished" podID="05fde3ad-1182-4b15-bb1a-f365ecc92d75" containerID="5e4262b6156b737db456cbad7dac00cd099bb509fa826739e48420e5a2b2118b" exitCode=0 Mar 09 10:10:15 crc kubenswrapper[4971]: I0309 10:10:15.767934 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerDied","Data":"5e4262b6156b737db456cbad7dac00cd099bb509fa826739e48420e5a2b2118b"} Mar 09 10:10:15 crc kubenswrapper[4971]: I0309 10:10:15.768531 4971 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p56wx" event={"ID":"05fde3ad-1182-4b15-bb1a-f365ecc92d75","Type":"ContainerStarted","Data":"252d9e1d14ea7aa6827f4fa75aa669ce2c6773198bcc1f6a498092ebbcab2395"} Mar 09 10:10:15 crc kubenswrapper[4971]: I0309 10:10:15.768570 4971 scope.go:117] "RemoveContainer" 
containerID="47243fe0b476c14ca7384b9b460547405437939dc86210c54e37fcc9ba8f9819"